
Conference turris::languages

Title:Languages
Notice:Speaking In Tongues
Moderator:TLE::TOKLAS::FELDMAN
Created:Sat Jan 25 1986
Last Modified:Wed May 21 1997
Last Successful Update:Fri Jun 06 1997
Number of topics:394
Total number of notes:2683

103.0. "What makes a high-level language?" by VOGON::HAXBY (John Haxby -- Definitively Wrong) Fri Sep 19 1986 11:49

    I decided to create a new note here rather than add to 42/89/99 --
    they are all getting a little busy and not strictly about what I
    want to sound off about.
    
    The question in the title is 'what is a high-level language?'.  The
    VMS literature talks about 'higher-level' languages to mean something
    other than assembler (i.e. Macro), usually meaning something like
    Pascal or C or Fortran; occasionally you will see references to
    Ada as well.
    
    I get really fed up (I was going to say "pissed off", but decided
    against it for the sensitive people) with people talking about Pascal
    and C and Ada as if they were the be-all and end-all of language
    design ... as if they really were truly "high-level" languages.
    
    A high-level language, I was always taught, is something that embodies
    a level of abstraction which is above the Von Neumann machine it
    is (usually) constrained to run on.  C hardly qualifies on that
    count:  you can see PDP-11 assembler lurking behind a lot of the
    alleged features.  (I use C a lot as an example because I've used
    it a long time.)  Pascal is little better.  In fact, what Pascal
    gains in abstraction (unless you start looking at the numerous
    'extensions') it loses in utility.  Have you ever tried to write
    a substantial program in 'pure' Pascal?  It's not easy.
    
    Lisp and Prolog are good candidates for a high level of abstraction,
    though not many people understand Prolog (in particular) well enough
    to write, say, Ultrix, in it.
    
    For a long time, I used to claim that a high-level language required
    two things: string handling and a garbage collector.  This almost
    qualifies Basic as a high-level language, though.  If you have a
    *true* garbage collector, not just a language simple enough to be
    able to do a 'free' on data structures as they are done with, then
    you probably do have a high-level language.
    
    Still, that's not enough.  High-level has come to mean something
    else, to me at least: it means that the language lets me get on
    and write the program I want to write without worrying about the
    irrelevant, and, in addition, does its best to spot my mistakes
    without being too demanding on what information I have to give to
    do that.  That really means that the language should include, somehow,
    a specification language sufficiently rich that the compiler can
    spot most of the mistakes that you are likely to make (apart from
    incorrect algorithms).  On this basis C, Pascal, Fortran, and possibly
    Ada and Modula-2 (I'm not sure) fall down.  Add the garbage collection
    and string manipulation and both Ada and Modula-2 don't look too
    good any more.
    
    I have, more or less, been describing a language I have been using
    quite a lot recently, so I'm not just up in the air about this alleged
    high-level language stuff.  I won't prejudice you by saying what
    it is ... not just yet, at least.  I often program for fun, not
    just because it's work, and I like to write programs which work,
    or add to large programs which already work and *know* that they
    will still work when I have finished.  So I like to use high-level
    languages.  It helps that the first language I learned was Algol68,
    which is still a pretty snazzy language ...
    
    							jch
103.1. "A rose is a sweet as you like." by LOGIC::VANTREECK Fri Sep 19 1986 14:27 (49 lines)
    A manager-type called me up a week or so ago. He noted that I
    had worked in PROLOG compiler development, and asked me what
    PROLOG was. I gave a brief explanation. He then wanted to know what
    experience I've had with 4th generation languages. Sigh, "4th
    generation languages?". "Yes, languages like DATATRIEVE, TEAMDATA,
    RALLY?". Bigger sigh, "No, not much experience with 4th generation
    languages." :-}
    
    The moral is: the definition of higher level, 4th generation, etc., may
    be quite different depending on who you're talking to. If you ask a
    language theorist for an example and ask a marketeer or SWS person
    in commercial applications the same question, you're likely to get
    very different "examples". But both would probably give very similar
    answers on what constitutes a higher-level language! That is, both
    would probably agree that the higher the level of abstraction, the
    higher the level of the language. And both would tell you that their
    example of a 4th generation language is valid because it is much
    more declarative than some other language.
    
    In my opinion, there's so much "slop" in the definitions that they
    are virtually useless. And any argument about the "level" of a
    language is likely to quickly degrade into a religious argument
    -- where there are never any winners.
    
    But I love to argue -- even when I know I can't win. So, I'm going to
    throw this your way: DATATRIEVE, TEAMDATA, and RALLY are nothing but
    application programs, i.e., they're nothing but
    database-forms-report-generation tools with a fancy command line
    interpreter. Anybody who calls them 4th generation ought to be shot and
    the survivors sued for false advertising.
    
    Ada, Pascal, and MODULA-2 have a much higher level of data abstraction
    than FORTRAN, BASIC, COBOL, PROLOG, and LISP. But all of these
    languages are very weak at execution abstraction. 
    
    PROLOG does a very good job of abstraction on the execution model,
    i.e., a declarative description of what is true rather than a
    procedural definition (algorithm). In this respect PROLOG is a much
    higher-level language than LISP, Ada, Pascal, MODULA-2, FORTRAN, BASIC, etc.
    But PROLOG is extremely weak at data abstraction -- at the level
    of FORTRAN and BASIC.
    
    LISP is somewhere in between PROLOG and FORTRAN in declarativeness,
    and somewhere in between Ada and FORTRAN in data abstraction.
    You might say it does a half-a** job of everything.
    
    There! Take that you language freaks! 
    
    -George
103.2. "hash and rehash" by CLT::GILBERT (eager like a child) Sat Sep 20 1986 01:56 (10 lines)
We've debated this before in this forum.  The rough conclusion is that
a 4GL allows groups of things to be manipulated as easily as single things
are manipulated in a 3GL.  DATATRIEVE, TEAMDATA and RALLY qualify (an
RDB paradigm helps), but so do APL, and possibly RPG!  LISP and PROLOG
probably *don't* properly fit this simple categorization, yet PROLOG is
a fairly high-level language (though somewhat specialized).

Program generators are nGLs, where n is whatever the marketeers are willing
to claim.  But if n is greater than 4, they're fibbing.  I'd expect to see
a generally accepted definition of a 5GL before the end of the century.
103.3. by VOGON::HAXBY (John Haxby -- Definitively Wrong) Sat Sep 20 1986 10:00 (38 lines)
>                                          The rough conclusion is that
>a 4GL allows groups of things to be manipulated as easily as single things
>are manipulated in a 3GL.
    
    I disagree with this: Algol68 lets you treat collections of things
    as single things:  you want to add two 'person' records with the
    plus symbol (+)?  No trouble.  However, I wouldn't call '68 a 4GL.
    
    I would regard Prolog (and the like) and the object-oriented languages
    (*not* Modula-2 and Ada, they aren't, not really) as 4GLs, taking
    fourth generation simply as a step beyond the third.
    
    However, .1 has it, more or less: high-level languages, surely,
    are those which make useful abstractions possible.  Prolog does,
    if you are doing that kind of thing.  Object-oriented languages
    also have a high level of abstraction: there's an object here and
    we can do things to it.  For my money the OO languages are more
    useful than Prolog et al because I can write editors using them,
    like this one.  The abstractions help me get on with what I want
    to do.  They don't get in the way.  Maybe a high-level language
    is simply that: a language that goes out of its way to help you
    write correct (provably?) programs and doesn't get in the way of
    what you are trying to do.
    
    I wouldn't even stoop to calling RALLY, TEAMDATA and DATATRIEVE
    languages.  Maybe I should say 'programming languages' -- query
    languages are something else altogether, though I can't claim
    much enthusiasm for them.  Mind you, I know next to nothing about
    that particular trio, so ...
    
    Still ... does anyone know of a language which really does have
    a high level of abstraction but which is also sufficiently
    well-designed that it makes writing incorrect programs harder than,
    say, in C?  Does it include a specification language?  (Ever try
    to read Ultrix code?  No comments and no specifications anywhere;
    why do you think we all look like we do?)
    
    							jch
103.4. "We didn't sell $10M in a year for nothing!" by TLE::MEIER (Bill Meier) Sun Sep 21 1986 19:33 (31 lines)
< Note 103.3 by VOGON::HAXBY "John Haxby -- Definitively Wrong" >

    Still ... does anyone know of a language which really does have a high
    level of abstraction but which is also sufficiently well-designed that it
    makes writing incorrect programs harder than, say, in C?  Does it include a
    specification language?  (Ever try to read Ultrix code?  No comments and no
    specifications anywhere; why do you think we all look like we do?) 

I believe Ada provides a high level of abstraction; with the proper design of a
system, Ada gives you the power to define arbitrary datatypes, and overload (or
restrict) all the standard operators. If you want to define "+" for your private
data type you can. If you want to prohibit it, you can. With proper packaging,
you can hide implementation details, and present any level of abstraction to the
user. Yes, it's still a procedure-oriented language, but it is also gaining
popularity as an AI language as well. 
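
As a rough sketch of that idea -- in C++ rather than Ada, with a made-up
"Money" type, so purely illustrative -- hiding the representation and choosing
which operators exist looks something like this:

    // Hypothetical example: the representation is private, "+" is defined
    // by the type's author, and "*" simply isn't provided, so the
    // abstraction forbids it.  (A C++ sketch, not Ada.)
    #include <iostream>

    class Money {
    public:
        explicit Money(long cents) : cents_(cents) {}
        Money operator+(const Money& other) const {    // "+" allowed, defined here
            return Money(cents_ + other.cents_);
        }
        long cents() const { return cents_; }           // controlled access
    private:
        long cents_;                                     // hidden implementation detail
    };

    int main() {
        Money a(150), b(275);
        Money c = a + b;                  // fine
        // Money d = a * b;               // would not compile: no "*" declared
        std::cout << c.cents() << "\n";   // prints 425
        return 0;
    }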

I feel any strongly typed language, such as Ada, is sufficiently well-designed
that it makes writing incorrect programs harder. For every subprogram call, for
every assignment, for every operator, you are assured that the correct objects
are being used. I will contrast this with Bliss (which I'll rate along with C):
very often, I'll write a small to medium Ada program, and as soon as I get it to
compile, it runs without bugs. Bliss, on the other hand, generally "compiles"
the first time, but requires debugging before it really works. And I consider
myself a very proficient Bliss (and Ada) programmer. 

Does it include a specification language? Yes! The language is so good that the
specification language is itself! This is demonstrated by the fact that many of
the Program Design Languages being developed today (including those at DEC) are
based, either directly or indirectly, on pure Ada. 

Ok, I admit it; I'm biased -- I'm a VAX Ada developer! 
103.5. "High-level data abstraction and the tools to operate on the dat" by DREGS::BLICKSTEIN (Dave) Mon Sep 22 1986 11:21 (76 lines)
>    I get really fed up (I was going to say "pissed off", but decided
>    against it for the sensitive people) with people talking about Pascal
>    and C and Ada as if they were the be-all and end-all of language
>    design ... as if they really were truly "high-level" languages.
    
    You and me both.

>    A high-level language, I was always taught, is something that embodies
>    a level of abstraction which is above the Von Neumann machine it
>    is (usually) constrained to run on.  
    
>    Still, that's not enough.  High-level has come to mean something
>    else, to me at least: it means that the language lets me get on
>    and write the program I want to write without worrying about the
>    irrelevant, and, in addition, does its best to spot my mistakes
>    without being too demanding on what information I have to give to
>    do that.
    
    I think you've stated it quite well.  And in my opinion, I think
    it's evident that you can't talk seriously about C in this vein;
    Pascal is somewhat better.  Bill Meier appropriately points out (though
    not in those words) that Ada's data abstraction can be used to
    EXTEND Ada into a more abstract language, but note that the extension
    packages themselves are strictly Von Neumann.
    
    I don't hold APL up as a 4GL, but I do like to point out that APL
    (which was invented in 1957, almost down the hall from Backus when
    he was inventing FORTRAN) embodies many 4GL concepts that even recent
    languages missed.
    
    Now, what I'm going to do is give what I admit to be a contrived
    example, but I think it is still valid for demonstrating my point.
    
    Assuming you don't use non-general builtin functions, look at what
    is necessary to calculate an average in the various FORGOL (my term
    for FORTRAN-derived languages, which include Pascal, C, and Ada)
    languages:
    
    	1. A summation variable (and perhaps a declaration for it)
    	2. An initialization for the summation variable
    	3. A syntactic structure for a loop
    	4. A loop counter (and perhaps a declaration)
        5. An accumulation assignment
    
    All of these things are synthetic.  They are irrelevant and are
    only there due to the limitations of the languages.   An average
    is a sum divided by the number of elements, so the most efficient
    notation is one that specifies exactly that, directly.  That
    is, you want to specify WHAT the answer is, NOT how to compute it
    on a serial computation device.
    
    I think most people would find:
    
    			SUM(A) / SHAPE(A)
    
    more obvious than
    
        DCL S INTEGER
    	DCL I INTEGER
    	S = 0
    	Do I = 1 to n
    	   S = S + A(I)
    	   end
    	AVG = S / n
    
    There is a large amount of "noise" in the second example that makes
    it both more difficult to read, and difficult to program.
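
    As a concrete sketch of the same contrast -- in C++, with a hypothetical
    sum() helper, so only an illustration and not APL -- the two styles look
    like this:

    	// Loop version: summation variable, initialization, loop syntax,
    	// loop counter, accumulation assignment -- all "synthetic" noise.
    	double average_loop(const double a[], int n) {
    	    double s = 0;
    	    for (int i = 0; i < n; ++i)
    	        s = s + a[i];
    	    return s / n;
    	}

    	// Expression version: reads like SUM(A) / SHAPE(A); the noise is
    	// pushed down into a reusable sum() helper.
    	double sum(const double a[], int n) {
    	    double s = 0;
    	    for (int i = 0; i < n; ++i)
    	        s = s + a[i];
    	    return s;
    	}

    	double average_expr(const double a[], int n) {
    	    return sum(a, n) / n;
    	}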
    
    I think the key thing for a high-level language is to identify
    the high-level abstract datatypes (in APL this is arrays), and then
    provide a rich set of functions, operators and syntax, if necessary
    (APL has hardly any syntax; in fact the VAX APL syntax parser, which
    was a state machine, had about 13 states total!!!), to manipulate them 
    (in APL, the high-level operators are reduction, compression,
    expansion, etc.)
    
    	db
103.6. by LOGIC::VANTREECK Mon Sep 22 1986 12:53 (23 lines)
    RE: 4GL implying the handling of groups of items at a time.
    
    I would like to remind you that PROLOG is based on the deductive
    database model, i.e., it is a superset of the relational model.
    Consider the following PROLOG rule (where 'likes' is defined
    as an infix operator, and ':-' is like 'if' in other languages):
    
    		judy likes _something :- mary likes _something.
    
    Note that this single rule deals with the set of all things
    that Mary likes, i.e., no 'for' loops, no 'while' loops, etc.
    
    RE: program correctness.
    
    This is where PROLOG really shines above all other VAX/VMS languages.
    1) PROLOG does not have destructive assignment, which makes writing
    PROVABLY correct programs much easier!  Ada, MODULA-2, Pascal, etc.,
    cannot claim this.  2) PROLOG is an acronym for "programming in logic",
    i.e., its inference engine uses first-order logic to produce all
    results!  If you're concerned about program correctness, PROLOG is
    the best language to use.
    
    -George
103.7. "Implementation----------^level^-----------problem" by MLOKAI::MACK (a(2b) Mon Sep 22 1986 23:52 (45 lines)
    According to typical object-oriented "doctrine", in an adequate
    program the implementation represents a perfect "reverse image" or
    "model" of the space surrounding the problem. 
    
    As I understand the term, a 4GL involves working primarily in the
    language of the problem space, with minimal reference to implementation
    (allowing the language to choose the appropriate implementation), i.e.
    a degree of automatic programming.  Obviously this works best within
    limited spheres that are well-understood, thus items like the COBOL
    generator and DTR. 

    A good 3GL has to be able to be *used* by the programmer to build his
    model of the problem space in a language that summarizes and condenses
    the capabilities of the implementation.  This means the programmer
    chooses the mapping, but the tool assists the programmer.  Such a
    language has to allow for many different kinds of items.  This requires
    a language with a rich *variety* of data types, methods of aggregating
    them, and ways of operating on the aggregates.  In addition, it
    must allow both a broad and fine brush for describing the implementation.
    
    From what (admittedly little) I know of APL, it is so-so as a 3GL --
    somewhat weak on variety of primitive data and has only one serious
    method of aggregation: arrays.  In what way is it an effective 4GL?  I
    don't know if any language qualifies as a 4GL for a broad class of
    complex problems by the above understanding. 
    
    PROLOG seems like something totally other: using mathematics as an
    intermediate language between the problem space and the implementation
    space.  Its advantage is that it is susceptible to analysis and
    therefore has some chance of being evaluated for "correctness". I've
    never used PROLOG.  I get the impression that you have to be a
    mathematician to use it effectively, and I never cared much for
    math -- I've always found it hard to understand in terms of things
    I can see and touch. 
    
    How close does PROLOG allow you to get to simply describing the problem
    in a form readable to laymen?  And how much math do you have to know in
    order to use it?  Finally, is the state of the art in implementation of
    PROLOG anywhere near practical for *any* class of real-world problems? 
    
    						Ralph

P.S. BTW, a 5GL would eliminate the need for a specialist in software,
     right?  (Anyone could do it.)
103.8. by CLT::GILBERT (eager like a child) Tue Sep 23 1986 01:45 (29 lines)
re .3

    On writing .2, I realized that a language with overloaded operators
    could be built into one that handles collections as easily as single
    elements.  But this feature is not built into Algol or Ada -- it's a
    layer that must be provided by the programmer.

re .6

    Thanks for the correction about PROLOG.  In it, can I (for example)
    say that 'likable' things are soft and cuddly, and define a function
    that, when passed the contents of my toy-box, returns the things that
    have both these attributes?

    The lambda calculus (an assignment-free language) was invented to
    provide a language in which programs could be proved correct.  It's
    no wonder that one of its progeny should be acclaimed for this.

re .7
    
>   As I understand the term, a 4GL involves working primarily in the
>   language of the problem space, with minimal reference to implementation

    First, I'll disagree with the definition.  That aside, ...

    This is quite an interesting metric -- the closeness with which a
    language can model a problem.  The object-oriented languages should
    be highly rated, as should architectural analysis systems, and most
    WYSIWYG systems.
103.9. "Assignment statements aren't that hard" by TLE::FELDMAN (LSE, zealously) Tue Sep 23 1986 12:12 (25 lines)
    Re: program correctness and destructive assignment:
    
    The semantics of assignment are well understood, and have been defined
    for languages such as Pascal and Ada.  I don't believe that the
    assignment statement makes programs qualitatively harder to manipulate
    formally (i. e. the same tactics work), although they undoubtedly
    increase the size of the clauses being manipulated.  I'm not really
    sure whether this increase is significant, particularly since I
    expect the manipulations to be automated.
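
    (As an aside of my own, for reference: the usual Hoare axiom for
    assignment is
    
    		{ P[E/x] }   x := E   { P }
    
    i.e., to show that P holds after the assignment, show that P with E
    substituted for x held beforehand -- which is why assignment itself is
    formally tractable, even if it does enlarge the clauses.)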
    
    The largest instance of a program verification that I've seen is a thesis
    out of Stanford (or perhaps Berkeley) dealing with a compiler.  To the
    best of my recollection, the base language was Pascal or something
    similar.   The thesis was published by Springer-Verlag in their Computer
    Science series, but since I haven't looked at it for about three
    years, I apologize for not having more details.
    
    I'm curious as to whether anyone knows of other large programs being
    verified?
    
    I quite agree with the second statement about Prolog, i. e. that
    since it uses first order logic, it is quite easy to prove assertions
    about Prolog programs.
    
       Gary
103.10. by DREGS::BLICKSTEIN (Dave) Tue Sep 23 1986 12:31 (62 lines)
    re: 103.7 by MLOKAI::MACK "a(2b"
    
>    From what (admittedly little) I know of APL, it is so-so as a 3GL --
>    somewhat weak on variety of primitive data and has only one serious
>    method of aggregation: arrays.  In what way is it an effective 4GL?

    Well, first of all I thought I had made it clear that I wasn't touting
    APL as a 4GL (from my reply in .5):
    
    	"I don't hold APL up as a 4GL..."
    
>    As I understand the term, a 4GL involves working primarily in the
>    language of the problem space, with minimal reference to implementation
    
    The problem space for APL IS arrays, thus your reason for excluding
    APL from the label of 4GL seems to be somewhat in conflict with
    your understanding of the definition of a 4GL.
    
    What I take your understanding of 4GL to mean is a language with
    specialized representations for data, and high-level functions to
    operate on the data.
    
    Effectively, everything in APL is an array.  All of the APL functions
    operate on arrays.  APL has many high-level operations specific
    to arrays, such as reduction, inner product, outer product, etc.
    By combining these functions we derive even higher-level functions.
    For example, combining the plus function with the reduction operator
    gives us sum; combining plus and multiply with inner product gives
    us matrix multiply.
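
    As a rough analogue of that kind of composition -- sketched in C++
    rather than APL, so only an analogy -- a reduction over plus and an
    inner product over plus and times look like this:

    	// "Reduction" of + over a (like +/A), and the "inner product" of
    	// + and * over a and b (the scalar case of APL's +.x).
    	#include <iostream>
    	#include <numeric>
    	#include <vector>

    	int main() {
    	    std::vector<int> a = {1, 2, 3};
    	    std::vector<int> b = {4, 5, 6};
    	    int sum = std::accumulate(a.begin(), a.end(), 0);                // 6
    	    int dot = std::inner_product(a.begin(), a.end(), b.begin(), 0);  // 32
    	    std::cout << sum << " " << dot << "\n";
    	    return 0;
    	}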
    
    I think my average example is a good demonstration of how the
    existence of high-level functions minimizes references to the 
    implementation, unlike the FORGOL example given.  Unfortunately,
    I tried to put the essence of the APL form of the solution into a
    syntax that non-APL people would understand (this is because experience
    tells me that people will call any notation "unreadable" if they
    don't happen to already understand the notation).
    
    In APL, average is expressed as (+/A)/(@A)    (where @ stands in for
    the shape function).   Using a more familiar notation, this would
    be equivalent to:
    
    		REDUCTION(+,A)/SIZE(A)
    
    Note the absence of implementation details like loops, index
    variables, sum variables, assignments, etc.
    
    APL IS much worse than a "so-so" 3GL if you are attempting to use
    it as a 3GL (i.e. you don't use its intrinsic functions, or the
    application isn't well suited to an array representation).
    
    	db

103.11. by LOGIC::VANTREECK Tue Sep 23 1986 20:52 (110 lines)
    RE: .8
    
    >In it, can I (for example)
    >say that 'likable' things are soft and cuddly, and define a function
    >that, when passed the contents of my toy-box, returns the things that
    >have both these attributes?
    
    Yes, it is quite easy to implement inheritance in PROLOG. DEC being
    a member of MCC means you can get the MCC paper on BIGGERTALK.
    BIGGERTALK implements inheritance and object oriented programming
    in PROLOG. I think you would find it very interesting. 

    >The lambda calculus (an assignment-free language) was invented to
    >provide a language in which programs could be proved correct.  It's
    >no wonder that one of its progeny should be acclaimed for this.
    
    Alas, pure LISP (the lambda calculus) is not useful for real-world
    programming. I'm told that versions with much-needed features (e.g.,
    Common LISP) have so badly abused the paradigm that most programs
    written in such LISPs would probably be very difficult to prove
    correct. 
    
    re: response about destructive assignment
    
    True, destructive assignment does not make it impossible to prove
    correctness. But it complicates it greatly for anything but the most
    trivial programs. I believe the proofs by Wirth and his understudy of
    the correctness of Pascal's semantics do not imply correctness of
    Pascal programs. Knowing that the building blocks are sound does not
    mean the resulting structure comprised of those blocks is sound. It
    simply gives confidence that it is possible to prove that SOME Pascal
    programs are provable. PROLOG tries to impart correctness to PROLOG
    programs. That's a BIG difference! 
    
    Unlike LISP implementers, PROLOG implementers have been slow to add
    real-world features like modules, a firm notion of types, etc. This is
    because they are concerned about keeping the language as 'pure' as
    possible, i.e., retaining as much of a sound logical foundation as possible.
    Even the current violations of this purity, e.g., assert and
    retract, are accepted only because the language would be totally
    useless without them. Even cut and the I/O routines, which destroy much of
    the declarative nature of a program, are accepted only as necessary evils.
    I'm concerned about maintaining purity, but I'm not nearly as fanatical
    about it as some. I've got work to do, and I don't want some theorists
    tying my hands. 
    
    re: PROLOG being for mathematicians.
    
    There's some validity to that, in that some implementations are rather
    short on features for an industrial-strength language. But that's
    changing quickly, e.g., Quintus PROLOG (marketed by DEC). One does not
    need to be a mathematician at all to understand and use PROLOG. The
    biggest problem with learning to program in PROLOG is that long-time
    hackers used to programming in procedural languages sometimes have
    difficulty thinking and programming in a declarative, recursive style.
    
    An example for the non-mathematician: suppose you wanted to write a
    procedure that would take a linked list and append it to another linked
    list. Or, if given the appended linked list and one of the pieces,
    return the other piece. Or, if only the appended list is given, return
    the set of all possible combinations of two lists that would give the
    appended list. Or, if two lists are given, check to see if the append of
    those two lists matches some third list. That program requires two lines
    of code in PROLOG. 
    
    		append([], _list, _list).
    		append([_f|_t1], _l, [_f|_t2]) :- append(_t1,_l,_t2).
    
    The first line says that the append of an empty list to a list is the
    list itself. The second rule says that the append of a list with first
    element, _f, and tail, _t1, to the list, _l, is a new list with the
    first element, _f, and a new tail, _t2, 'if' the new tail, _t2, is the
    append of the tail of the first list, _t1, to the list, _l.
    
    If you had to write two rules that described the results of appending
    two lists, could you think of anything more simple and straightforward? 
    
    If you called the procedure 'append' with the arguments:
    
    		append([a,b,c],[foo,to,you],_result).
    
    PROLOG would return the result: [a,b,c,foo,to,you].
    
    If you called the procedure 'append' with the arguments:
    
    		append(_first, _second, [a,b]).
    
    PROLOG would return the results:
    
    	_first = []
        _second = [a,b]
    
    	_first = [a]
    	_second = [b]
    
    	_first = [a,b]
    	_second = []
    
    If 'append' were called with arguments that cannot be proved to
    be true, it will fail and return a "No.".  For example:
    
    	append([a,b],[c,d],[a,c,b,d]).
    
    would fail and return a "No.".
    
    Can Powerhouse or any other 4GL implement that procedure in
    such a simple, straightforward, small routine -- with guaranteed
    logically correct results to boot? 
    
    -George
103.12. "Casting stones at a house of smoke" by CLT::GILBERT (eager like a child) Wed Sep 24 1986 01:09 (13 lines)
>   Can Powerhouse or any other 4GL implement that procedure in
>   such a simple, straightforward, small routine -- with guaranteed
>   logically correct results to boot? 

    Easy, now.  It's easy to be misled into believing that humans are
    the greatest creation in the evolutionary tree, and that 4GLs are
    superior to 3GLs.  But both liverwurst and 3GLs have their uses, too.

    Perhaps I'm wrong.  Perhaps an nGL is always bigger, better, new
    and improved; more powerful, less bug-prone, easier to use, sexier,
    faster, smaller and more cost-effective than any piddling (n-1)GL.
    Perhaps we should mourn that there is no Adlai Stevenson IV, and
    marvel at the wonders of John D. Rockefeller VI.  Perhaps not.
103.13. "Lost in space(s)..." by MLOKAI::MACK (a(2b) Wed Sep 24 1986 11:53 (68 lines)
    Re .10:
    
>    The problem space for APL IS arrays, thus your reason for excluding
>    APL from the label of 4GL seems to be somewhat in conflict with
>    your understanding of the definition of a 4GL.

    An array cannot possibly be part of a problem space.  
    
    A problem space is an office with managers and secretaries and too much
    paper or a shipping dock with grimy invoices or a chemical tank with
    various chemicals in it or a smokestack venting air of uncertain
    quality. 
    
    An array is an abstraction in the implementation space.  (Or perhaps
    these "programming abstractions" represent yet another space?)

--------------------------------------------------------------------------
    
Re PROLOG:
    
    It occurred to me after writing .7 just what kind of an "other" thing
    PROLOG is.  We can postulate a third space besides the problem and
    implementation spaces.  This is the validation space.  The reasoning
    behind using PROLOG is:

    1) The programmer has to perform a translation between the problem
       space and some other space which may or may not be similar.
    2) The software will perform the translation between that intermediate
       space and the physical implementation (machine language).
    3) Any mapping directly to the implementation space bears the risk
       of errors of inconsistency.
    4) Why not let the programmer perform the translation into a space
       that only permits self-consistent programs?  
    
    Note that this doesn't necessarily mean correct from the user stand-
    point.  It can still have been written to do precisely the wrong thing. 

    Am I on the right track, PROLOG folks?

----------------------------------------------------------------------------
Newer thoughts:
        
    I guess, given the trend in the above ramblings, that we really have to
    treat the conceptual environment of each language as a space in its own
    right, separate from both the problem and implementation spaces.  
    
    The trick, then, in choosing a tool to solve a problem is to find a
    language whose "solution space" maps closely to the problem space (less
    chance of solving the wrong problem) with various desirable
    characteristics, like verifiability, fast execution, etc. 

    The term "generation" refers more to age and to which languages a language
    sprang from than to any inherent assumption of quality, appropriateness, or
    "level".  In this way, 4GLs would include both languages that map more
    closely to specific problem spaces than their predecessors and those
    which provide more of the desirable characteristics. 
    
    I am trying to come up with a workable approach to some very loosely
    defined terms.  Words, like languages, are simply tools, and I want
    to make sure that the words being used are effective tools of *both*
    communication and thought.  This means they have to both mean something
    useful and mean nearly the same thing to different people.  Is the
    above a good springboard or have I missed the mark?
     
    							Ralph
    
    Terms: Problem space, Solution space, Implementation Space, Generation,
           Level.
103.14. by LOGIC::VANTREECK Wed Sep 24 1986 12:17 (18 lines)
    re: .13
    
    The "other" use of PROLOG you suggest is correct. A problem with
    4GLs is that the great generality tends to result in less efficient
    execution for specific applications.
    
    In my opinion, 4GLs are currently good for quick prototyping and for
    use as a functional spec for the implementation in a lower-level
    language.  The Japanese MITI organization decided to build 5th generation
    computers that are so fast that it won't matter if a 4GL is a little
    inefficient.  They believe the next generation of computers must do more
    than manage data; they must also be able to make "intelligent" use of
    the data.  That is why they chose a 4GL (PROLOG) with symbolic
    processing capability that is amenable to implementation on parallel
    architectures as the target language, i.e., the design of the computer
    is tuned to efficient execution of logic programs. 
    
    -George
103.15. "Definition of 3GL/4GL" by MIRFAK::BROOKE (Intelligence as applied abstraction.) Wed Sep 24 1986 13:14 (34 lines)
    I believe that reply <.7> gives a reasonable method of defining 3GLs/4GLs.
    This is undoubtedly because it is quite similar to my own method
    of defining these animals.
    
    It is my contention that a computer language generation should be
    defined by its usage external to any particular language features.
    Here are two or three:
    
    3GL)
    	A third generation language should allow effective programming
    (by programmers) without detailed knowledge of the underlying machine.
    This is as opposed to 2GL (assembly code) and 1GL (machine code)
    which do require such knowledge.
    
    4GL)
    	A fourth generation language (1) should allow programmers to
    increase their productivity (not necessarily coding or typing speed)
    by a substantial amount (3-5 times is typically quoted for real
    4GLs), (2) should reduce the expertise required of the programmer,
    (3) substantially increase maintainability, and (4) allow immediate
    prototyping in itself.  Note that I do not mention system analysis.
    
    5GL)
    	A fifth generation language should essentially allow effective
    programming by non-programmers.  Such a system will undoubtedly
    require a hefty dose of AI.  I personally believe that expert systems,
    at least as they are today, will not provide sufficient functionality
    to accomplish this.  A new, major breakthrough or breakthroughs
    will be required.

    
    				Philip Brooke
    				Future Generations, Inc.
    				617-948-7812
103.16. "Pound it with APL" by TLE::HOBBS Wed Sep 24 1986 15:17 (6 lines)
RE: .10, .13 (taking APL as a 4GL by making arrays be the "problem space")

I am reminded of the statement:  "To a man with a hammer the whole
world looks like a nail."  which leads me to say:

To a person programming in APL the whole world looks like an array.
103.17. by DREGS::BLICKSTEIN (Dave) Wed Sep 24 1986 18:51 (10 lines)
    re: .16
    
    The statement that arrays were the problem space was meant as part
    of my analogy between 4GL properties and APL features.  I didn't
    mean to imply that all the objects in all problems in the world are arrays.
    
    To the FORGOL programmer all reduction functions in the world 
    look like a loop.
    
    	db
103.18. "... and APL is It" by CGHUB::CONNELLY (Eye Dr3 - Regnad Kcin) Sat Sep 27 1986 00:17 (9 lines)

I don't know what a "higher level language" is defined to be,

		... but I know one when I see one ...



:-)
103.19. "Software concepts aren't real" by MLOKAI::MACK (a(2b) Sat Sep 27 1986 23:36 (11 lines)
    Re .17:
    
    I might go further.  No objects in real-world problems are arrays, nor
    are any problems loops.  These are just tools to represent both the
    objects and the ways to act on them, hopefully resulting in a
    representation of the solution. 
    
    					Trying to hold a firm line between
    					reality and software,
    
    					    Ralph
103.20. by BIZET::VANROGGEN Mon Sep 29 1986 10:56 (7 lines)
    re .19:
    
    A common exception might be various language processors, such
    as compilers. Dealing with loops and arrays efficiently is definitely
    a real-world problem.
    
    			---Walter
103.21. "Oops...forgot about programs for programmers" by MLOKAI::MACK (a(2b) Mon Sep 29 1986 21:50 (3 lines)
    OK.  I'll grant you that.
    
    			-- Ralph
103.22. "Concretizing Abstractions" by CIM::JONAN (We should've stopped at fire...) Wed Oct 01 1986 11:26 (11 lines)
    Re: .19-.21
    
    Virtually anything can count as a "real" object or "real world
    problem" - it just depends on what your problem space is.  Certainly
    arrays (matrices) are "real objects" presenting "real problems"
    to, say, a mathematician working on some aspect of linear algebra.
    The same sort of thing can be said about algorithms (just look at
    Knuth's ongoing treatise!)  One person's abstraction is another
    person's brick wall!
    
    /Jon
103.23. "back to the Basics?" by CGHUB::CONNELLY (Eye Dr3 - Regnad Kcin) Sat Oct 11 1986 00:00 (22 lines)
Sorry, but I tend to think of "high level" in terms of "how
many statements in language X does it take to express a
function that can also be done in languages Y, Z, etc."?

By that standard, APL is far and away the "highest level"
language.  MUMPS and ICON come pretty close.  Most other
languages are not even in the running, unless you start
counting database manipulation languages (DMLs) as part
of the domain in question.

Formal verification of program "correctness" is made much
easier by strong typing and other such manipulations, but
that does not necessarily make the language more "high level"
from the standpoint of the programmer.

An ugly but meaningful question when discussing applications
is "how fast can I code this application and get it up and
running?".  A secondary question is "how easy is it to find
and debug errors in my application?".  For both questions,
APL and MUMPS are far superior to "verifiable" languages
like PASCAL and MODULA-2.
103.24. "APL?" by VOGON::GOODWIN (Pancake seated; Tree watching) Mon Oct 13 1986 04:46 (8 lines)
    Pardon my ignorance, but doesn't APL use a rather large set of
    operators? The real APL also needs a special keyboard, as I understood
    it. In what way is it superior? Whenever I've seen APL it looks
    like gobbledygook. I mean, I find it confusing to read! Are the expressions
    executed right to left? And finally, isn't it an interpreted language?
    
    I agree that with its richness of operators and functions it may be
    very fast and easy to write code, but can anyone read it afterwards?
103.25. "Why do languages have to look like FORTRAN to be readable?" by DREGS::BLICKSTEIN (Dave) Mon Oct 13 1986 12:14 (56 lines)
>    Pardon my ignorance, but doesn't APL use a rather large set of
>    operators? The real APL also needs a special keyboard, as I understood
>    it. In what way is it superior? Whenever I've seen APL it looks
>    like gobbledygook. I mean, I find it confusing to read! Are the expressions
>    executed right to left? And finally, isn't it an interpreted language?
    
    Let's get something straight here:  Is one of the requirements for
    a "superior" language that it look enough like FORGOL (my name for
    FORTRAN and its one-plus descendants (Ada, Pascal, Algol, etc.)) such that
    it is not "confusing" to read in the absence of any knowledge about
    the new language?
    
    That places a rather severe limitation on new languages.  I mean, I'm
    sure that any spoken language (Greek, Hebrew, etc.) that uses
    different symbols looks like "gobbledygook" to you, but does that make 
    them inferior?
    
    You think APL is unreadable because you don't understand the symbols.
    That's like saying Japanese is unreadable!  Once you understand the 
    symbols, APL is arguably MORE readable than FORGOL, because the symbols
    represent high-level constructs that must be built out of
    several primitive constructs in FORGOL.
    
    	db
    
    P.S.  To answer your other questions:
    
    	1) Yes, APL uses a larger set of operators than FORGOL.  Most
           people consider this an advantage.  Those operators typically
           replace "idioms" in FORGOL (loops, etc.) with a single character
           that clearly identifies the operation.
    
        2) VAX APL does not require a special keyboard (unless you consider
           a VT220 special).   Recent advances in video terminals have
           reduced the need for special hardware.  (The terminals are "soft"
           enough to satisfy APL's requirements.)  Also almost all APL
           implementations allow you to use non-APL terminals for
    	   APL programming, and ALL implementations allow you to use
           an APL application on a non-APL terminal.
	
    	3) Most APL implementations are interpreters.  There are many
           APL compilers out there, although the absence of declarations makes
    	   compilers difficult to implement.  However, a good APL
           implementation will run like a bat out of hell on vector
    	   machines.  The reason for this is that the parallelism of
    	   arrays is INHERENT in APL, whereas it has to be detected 
    	   and extracted with very complex analysis in most FORGOL
    	   languages.  	Also, certain kinds of vectorization of loops in 
    	   FORGOL are not legal (even though very desirable in most cases) 
    	   due to "conflicts" resulting from order-of-evaluation rules in 
    	   the non-vector case.  In fact, many of the vectorizing compilers
           around actually have a mode where they ASK PERMISSION (or
           require you to declare permission) to do certain kinds of
           vectorizing transformations.  Such things are practically
           unnecessary in APL.
    
103.26. "Goblegook...gobledegook" by VOGON::GOODWIN (Pancake seated; Tree watching) Tue Oct 14 1986 09:42 (38 lines)
>    Let's get something straight here:  Is one of the requirements for
>    a "superior" language that it look enough like FORGOL (my name for
>    FORTRAN and its one-plus descendants (Ada, Pascal, Algol, etc.)) such that
>    it is not "confusing" to read in the absence of any knowledge about
>    the new language?
    
    Perhaps I should have made it clear I was expressing a preference.
    After all, it's certainly easier to code in Pascal, Algol, etc. --
    if you've learnt one you've probably learnt another.  My knowledge
    of APL is non-existent, plus I've not seen much use of it, but that
    is simply my ignorance!

    It would be terrible if all new languages were rigidly based on
    FORTRAN, Pascal, etc., but then aren't languages usually based on
    a specific need?  OCCAM serves the Transputer, FORTRAN scientific
    work, C (from Unix) systems stuff, and so on!
    
>    That places a rather severe limitation on new languages.  I mean, I'm
>    sure that any spoken language (Greek, Hebrew, etc.) that uses
>    different symbols looks like "gobbledygook" to you, but does that make 
>    them inferior?
>    
>    You think APL is unreadable because you don't understand the symbols.
>    That's like saying Japanese is unreadable!  Once you understand the 
>    symbols, APL is arguably MORE readable than FORGOL, because the symbols
>    represent high-level constructs that must be built out of
>    several primitive constructs in FORGOL.
    
    Hmm. I can understand Greek symbols, and I speak a few words of
    Greek...
    
    Does this make sense to you if I don't tell you which language it
    is?
    
    : TEST 100 0 DO I CR . LOOP ;

    And finally, one more question: are APL interpreters/compilers
    written in APL?
103.27. by CIM::JONAN (Me/Gauss = Cow/Calculus) Tue Oct 14 1986 11:36 (17 lines)
    Re: .25

>    Let's get something straight here:  Is one of the requirements for
>    a "superior" language that it look enough like FORGOL (my name for
>    FORTRAN and its one-plus descendants (Ada, Pascal, Algol, etc.))

    Yeah, let's get something straight: FORGOL is a *"misacronym"* ( :-) ).
    Ada, Pascal, and Modula are ALGOL-class languages, but FORTRAN is definitely
    NOT.  FORTRAN is NOT block structured (everything has the same scope
    -- global), does NOT have recursive capabilities, does NOT have dynamic
    allocation, and does NOT have free-form syntax (for example, DOI can be
    the same as DO I).  These are major conceptual differences!  About the
    *only* thing that FORTRAN has in common with ALGOL-class languages is
    that it happens to be a procedural language (but then, even LISP is in
    this category).
    
    /Jon
103.28. by DREGS::BLICKSTEIN (Dave) Tue Oct 14 1986 11:44 (31 lines)
re:    < Note 103.26 by VOGON::GOODWIN "Pancake seated; Tree watching" >
>                          -< Goblegook...gobledegook >-

    
>    Does this make sense to you if I don't tell you which language it
>    is?
>    
>    : TEST 100 0 DO I CR . LOOP ;
    
     
    I will gladly answer this question, but to help answer it properly
    I'd appreciate it if you could give me an idea of the point behind
    the question.

>    And finally, one more question: are APL interpreters/compilers
>    written in APL?

    I also don't understand the point of this question.  But to answer
    it, I'll say sorta.   VAX APL, for example, is largely written in
    BLISS; however, the next version will have certain functions simulated
    in APL.
    
    Also, an APL interpreter written in APL is a trivial thing.  In
    fact, here it is:   .BX 
    
    In general, I'd have to say that writing an APL implementation in
    APL is rather uncommon, for two reasons:  1) it rarely makes sense
    to do so, and 2) there are better languages for that application.
    
    	Dave Blickstein

103.29. "Defense of acronym FORGOL" by DREGS::BLICKSTEIN (Dave) Tue Oct 14 1986 12:04 (20 lines)
    re: .27  CIM::JONAN
    
    The FORGOL term, as I defined it, does not represent a set of languages of
    similar "class" but rather a set of languages with common ancestry.
    As such, I do not believe it is a "misacronym".

    I believe that it is evident that Ada, Pascal, etc. are very much
    derived from FORTRAN (procedural, declarations, syntax for loops,
    if-then-else, scoping, etc.) whereas languages like LISP, APL, SETL,
    etc. are much more easily differentiated from FORTRAN.
    
    This is partly evidenced by the observation that FORGOL programmers
    (too) frequently claim that unfamiliar FORGOL languages are more 
    "readable" than APL.
    
    	db
        
    P.S. Ironically, FORTRAN 8X (the next FORTRAN standard) has all the 
    features that you've mentioned to differentiate the other FORGOL
    languages from FORTRAN.
103.30. "Bad genealogy, reasonable concept" by TLE::FAIMAN (Neil Faiman) Tue Oct 14 1986 12:44 (26 lines)
    > I believe that it is evident that Ada, Pascal, etc. are very much
    > derived from FORTRAN (procedural, declarations, syntax for loops,
    > if-then-else, scoping, etc.) whereas languages like LISP, APL, SETL,
    > etc. are much more easily differentiated from FORTRAN.
      
    It is evident that there is a direct line of descent from ALGOL,
    through Pascal, to Ada (with side branches for ALGOL-W, ALGOL-68,
    CPL -> BCPL  -> C, PL/I, etc.)  However, the derivation from FORTRAN
    is spurious.  The original FORTRAN and ALGOL had almost nothing
    in common syntactically.  FORTRAN had subroutines, but they looked
    nothing like ALGOL procedures.  Declarations were allowed, but
    not required (due to implicit declarations).  The only loop syntax
    was DO-loops.  There was no if-then-else.  There was no concept
    of scoping with nested blocks.  In short, it looked almost nothing
    like any modern ALGOL-derived language (and not very much like
    modern FORTRAN, either--it is clear that modern FORTRAN has been
    heavily influenced by the ALGOL-derived languages).
    
    However, in a broader sense the "FORGOL" categorization *is*
    meaningful.  Take an algorithm written in one of the "FORGOL"
    languages, and it will probably have a straightforward
    translation into the others.  On the other hand, trying to do
    the same sort of translation between a FORGOL language, LISP,
    APL, PROLOG (choose any two) is much less plausible.
    
    	-Neil
103.31. "Symbols" by VOGON::GOODWIN (Pancake seated; Tree watching) Tue Oct 14 1986 13:06 (32 lines)
    Re: .28
    
>>    Does this make sense to you if I don't tell you which language it
>>    is?
>>    
>>    : TEST 100 0 DO I CR . LOOP ;
    
     
>    I will gladly answer this question, but to help answer it properly
>    I'd appreciate it if you could give me an idea of the point behind
>    the question.
    
    I was trying to use an example of another language which has an
    'obscure' set of symbols (i.e. DUP, SWAP, DROP, ROT, R>, >R, etc.)
    similar to APL.  I realise the languages are totally different, but,
    assuming you don't know what language the above is written in, is
    it obvious what it means?
    
    I'm trying to say that perhaps a language should look like what it means,
    rather than the programmer having to understand what the symbols mean.
    The above language has an underlying philosophy which you must
    understand before it makes sense, i.e.
    
    : TEST 0 DO I CR . LOOP ;
    100 TEST
    
    could be an example. In APL, am I right in saying it evaluates
    expressions right to left? This would be an example of what you
    need to understand before the language's constructs make sense.
    
    Pete.
    
103.32. "FORGOL <> Procedural" by MINAR::BISHOP Tue Oct 14 1986 13:11 (11 lines)
    APL is a procedural language.
    
    APL one-liners are non-procedural (as are "FORGOL" expressions).
    
    Backus' "functional programming" was an attempt to generalize the
    clever parts of APL.  I went to a talk he gave at the University
    of North Carolina in 1979 (or about then), and came away convinced
    that this was the coming shape of programming languages.  Since
    then I've seen nothing.  Has anyone else?
    
    				-John Bishop
103.33. "Does a language have to be like FORTRAN to be readable?" by DREGS::BLICKSTEIN (Dave) Tue Oct 14 1986 15:20 (80 lines)
re:  < Note 103.31 by VOGON::GOODWIN "Pancake seated; Tree watching" >
>                                  -< Symbols >-

    
>    I was trying to use an example of another language which has an
>    'obscure' set of symbols (i.e. DUP, SWAP, DROP, ROT, R>, >R etc)
>    similar to APL. I realise the languages are totally different, but
>    assuming you don't know what language the above is written in, is
>    it obvious what it means?
    
>    I'm trying to say perhaps a language should look like what it means,
>    rather than the programmer having to understand what the symbols mean.
>    The above language has an underlying philosophy which you must
>    understand before it makes sense, i.e.
    
    A valid point; however, I'm not convinced that any FORGOL language
    is any better than APL in this regard.  Do you think
    that DO, CONTINUE, CASE, SELECT, BEGIN/END, INCR, DECR, DECLARE,
    etc. are meaningful to anyone who doesn't already know a FORGOL 
    language?  What do you think your average non-FORGOL person would 
    make of the following statement?
    
    		A = A + 1
    
    Most of us are so used to FORGOL that we don't even think twice
    about how confusing that must be to a non-FORGOL programmer.
    
    It's my belief that a person who doesn't know a FORGOL language
    isn't going to have any more luck determining what a particular FORGOL
    program does than he would with an equivalent APL program.   One
    may be Greek, but the other will appear to be gibberish.
    
>    : TEST 0 DO I CR . LOOP ;
>    100 TEST
    
>    could be an example. In APL, am I right in saying it evaluates
>    expressions right to left? This would be an example of what you
>    need to understand before the language's constructs make sense.
    
    Yes, this is true for all languages.  Although, as I've said, I think
    FORGOL only looks like what it means if you happen to know FORGOL.
    Identifying a case statement by using the word CASE does little to
    help if you have no idea what a case statement is.
    
    Also, the essence of APL's advantages would not be compromised if
    symbolic names were substituted where symbols are now used.  In
    fact, many implementations support this as a way of using APL from
    non-APL-capable terminals.  On the other hand, the use of symbols
    helps to differentiate objects from operations.  I'm sure you'd
    agree that A GETS A PLUS B is not as readable as A = A + B, once
    you understand what = means (in this context) and what + means.
    
    You've indicated the right-to-left rule is confusing. That
    rule exists to remove the need for the programmer to remember
    the complex (and differing) set of operator precedences, 
    associativity rules, and order of evaluation rules that are so 
    pervasive among FORGOL languages.  For example, can you off the top 
    of your head tell what the following expression evaluates to in each 
    FORGOL language that you are familiar with:
    
    	2*1/3/4*5
    
    An APL student could easily tell you the answer for APL after his first
    lesson.
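
    (For the curious, a concrete check -- sketched in C++ just to have
    something compilable, so only an illustration:

    	#include <iostream>

    	int main() {
    	    int    i = 2 * 1 / 3 / 4 * 5;            // ((2*1)/3)/4*5 = 0 with integer division
    	    double d = 2.0 * 1.0 / 3.0 / 4.0 * 5.0;  // = 0.8333...
    	    std::cout << i << " " << d << "\n";      // prints: 0 0.833333
    	    return 0;
    	}

    whereas APL, evaluating strictly right to left, reads it as
    2 x (1 / (3 / (4 x 5))) = 40/3 = 13.333...)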
    
    If you find the APL rule hard to deal with, it is only because you are
    used to the FORGOL rules, which are actually much harder to initially
    grasp, frequently confused and forgotten, and are responsible for
    no small number of bugs.

    What I'm trying to demonstrate is that what people attribute to
    UNREADABILITY in APL is in each case nothing more than it just being 
    different from FORGOL.  I.E., I think you find APL UNREADABLE only 
    because it is DIFFERENT from what you're used to.
    
    	db
    
    BTW, the term FORGOL is not intended to be negative or derisive
         in any sense.   It is merely a convenient term I've introduced to
         explain my ideas.
103.34. "I'm skeptical" by DENTON::AMARTIN (Alan H. Martin) Tue Oct 14 1986 17:54 (75 lines)
Re .33:

>    A valid point; however, I'm not convinced that any FORGOL language
>    is any better than APL in this regard.  Do you think
>    that DO, CONTINUE, CASE, SELECT, BEGIN/END, INCR, DECR, DECLARE,
>    etc. are meaningful to anyone who doesn't already know a FORGOL 
>    language?  What do you think your average non-FORGOL person would 
>    make of the following statement?
>    
>    		A = A + 1
>    
>    Most of us are so used to FORGOL that we don't even think twice
>    about how confusing that must be to a non-FORGOL programmer.

Well, I know a 7th grader who decoded some typical-appearing FORTRAN code
to perform model rocket simulations without any prior knowledge of
programming language syntax.  I can't claim he'd have been able to do the
same with APL.  If the only benefit of semi-intuitive notation was to help
alleviate the software crisis by allowing us to take advantage of the
massive numbers of unemployed 6th graders by pulling them off of street
corners and sitting them in front of terminals, then I agree that it is not
an important characteristic of a general-purpose language.  Alternatively,
if programmers are expected to have some amount of mathematical training,
then finding common ground between the two disciplines has worth.

>    You've indicated the right-to-left rule is confusing. That
>    rule exists to remove the need for the programmer to remember
>    the complex (and differing) set of operator precedences, 
>    associativity rules, and order of evaluation rules that are so 
>    pervasive among FORGOL languages.

Too bad it is totally different from the rule which everyone should
know before high-school.

>    For example, can you off the top 
>    of your head tell what the following expression evaluates to in each 
>    FORGOL language that you are familiar with:
>    
>    	2*1/3/4*5
>    
>    An APL student could easily tell you the answer for APL after his first
>    lesson.
    
I don't believe it.  Your example looks like a classic "trick" question
for a first exam.  An APL student would be far more likely to blow
it than a student of honest algebraic procedural languages (HAPLs).  Also,
the trick of using division operations which require knowledge of whether
that expression involves integers and/or floating point wouldn't be so
funny if APL had more than one numeric data type, because then that student
would be in twice as much trouble.

>    If you find the APL rule hard to deal with, it is only because you are
>    used to the FORGOL rules, which are actually much harder to initially
>    grasp, frequently confused and forgotten, and are responsible for
>    no small number of bugs.

I'd be surprised if APL programs as a group have a lower incidence of
missing or misplaced parentheses than HAPL programs, at least upon the
first attempt at execution, especially since individual APL expressions
will be longer and will require more parentheses on average.

>    What I'm trying to demonstrate is that what people attribute to
>    UNREADABILITY in APL is in each case nothing more than it just being 
>    different from FORGOL.  I.E., I think you find APL UNREADABLE only 
>    because it is DIFFERENT than what you're used to.

I think it is quite believable that many claims that APL is unreadable are
because it flies in the face of conventions devised over centuries. Some of
those notations may be arbitrary, but many have a rationale behind them at
least as compelling as anything I've heard about APL.  The particular
choices that Iverson made in pursuit of a superior notation were not the
only ones that could have been made.  Considering how seldom people have
addressed themselves to the same choices, it does not seem justified to
hold APL up as the only correct way to do it merely because it is different.
				/AHM
103.35CIM::JONANMe/Gauss = Cow/CalculusTue Oct 14 1986 18:4970
    Re: .29

>    I believe that it is evident that Ada, Pascal, etc. are very much
>    derived from FORTRAN (procedural, declarations, syntax for loops,
>    if-then-else, scoping, etc.) whereas languages like LISP, APL, SETL,
>    etc. are much easier differentiated from FORTRAN.

    Hmmm, seems to me that (Common) LISP has procedures, declarations,
    syntax for loops, if-then-else, case statement (COND), and scoping
    (yes, lexical scoping too, by default).  It is certainly easy to
    differentiate from FORTRAN!  But not for the characteristics you
    cite.

    Example LISP:

    ;; NOTE: a_num and junk are assumed to be defined elsewhere
    (DEFUN a-proc (x)
        (COND ( (< x a_num)
                    (DO ((counter 0 (+ counter 1)))
                        ((= counter x) (print counter))
                        (setq junk (+ junk 1)))
              )
              ( (> x a_num)
                    (print "Hello Marion, is that you??")
              )
              ( (= x a_num)
                    (print "FEE FIE FOE FUM")
              )
        )
    )
    


>    P.S. Ironically, FORTRAN 8X (the next FORTRAN standard) has all the 
>    features that you've mentioned to differentiate the other FORGOL
>    languages from FORTRAN.

    Well then, why even bother with it?  Why not just use Pascal??
    If what you say is true, calling the thing FORTRAN is sort of
    ridiculous.  Suppose most of APL's constructs were built into Ada.
    Would you still call the result "Ada"??
    

    Re: .33

>    language?  What do you think your average non-FORGOL person would 
>    make of the following statement?:
>    
>    		A = A + 1

    This is FORTRAN, NOT ALGOL and its descendants.  Perhaps A := A + 1
    is not any more obvious, but it certainly is not as confused (or
    confusing).


>>    could be an example. In APL, am I right in saying it evaluates
>>    expressions right to left? This would be an example of what you
>>    need to understand before the language's constructs make sense.
>    
>    Yes, this is true for all languages.  Although as I've said, I think

    NOT true for all languages.  In fact, evaluation from left to right
    is more the norm: a * b / c = (a*b) / c in most languages.  Note
    that in a strictly mathematical setting, such an expression would
    *always* be considered ambiguous (and very poor practice).  However,
    it is true that other operators are right-associative: the booleans
    and the exponentiation operator being the most notable.


    /Jon
103.36Right associative boolean operators?DENTON::AMARTINAlan H. MartinWed Oct 15 1986 12:5315
Re .35:

>    NOT true for all languages.

You misread Dave's statement.  He was asserting that all languages contain
constructs opaque to the outsider, not that all languages have right
associative multiplication and division operators.


>However, it is true that other operators are right-associative: the
>booleans and the exponentiation operator being the most notable.
 ^^^^^^^^

Please give an example of an HAPL with right-associative boolean operators.
				/AHM/THX
103.37LOGIC::VANTREECKWed Oct 15 1986 14:0823
    Re: .32
    
    John, to answer your question about what happened to Backus' language
    FP: there is an interpreter or two written (IBM PC and VAX based ones).
    I think Imperial College (in England) is working on a compiler. FP was
    designed for non-von Neumann machines and is grossly inefficient on a
    single processor. On single processors like the VAX, FP is even slower
    than DATATRIEVE!  :-)
    
    FP's major weakness is handling routines that require side-effects,
    e.g., IO routines. What's needed for FP is a massively parallel
    processor (MPP) with a general purpose CPU to handle things like IO --
    similar to the product from Thinking Machines. 
    
    DEC's RAD committee funds a research project that's building a proto
    MPP. I think FP would be an ideal language to implement on the
    DEC MPP. Like Backus' first language for single processor computers
    (FORTRAN), I believe his second language (FP) might someday have the
    same success on MPPs. I know scientists and engineers doing lots
    of number crunching will love FP -- if there's a suitable processor
    for it to run on.
    
    -George 
103.38People will use tommorrow what they used today coz that's what they used yesterdayDREGS::BLICKSTEINDaveWed Oct 15 1986 17:59117
re: < Note 103.34 by DENTON::AMARTIN "Alan H. Martin" >

>Well, I know a 7th grader who decoded some typical-appearing FORTRAN code
>to perform model rocket simulations without any prior knowledge of
>programming language syntax.  
    
    What is meant here by "decoded"?  That's very vague.  I mean, what
    level of understanding did he achieve, and can it be determined that
    it was the "semi-intuitive" notation of FORTRAN that helped him achieve
    that understanding, or did he just read the comments?
    
>>    For example, can you off the top 
>>    of your head tell what the following expression evaluates to in each 
>>    FORGOL language that you are familiar with:
>>    
>>    	2*1/3/4*5
>>    
>>    An APL student could easily tell you the answer for APL after his first
>>    lesson.
    
>I don't believe it.
    
    I don't believe you don't believe it.  
    
>Your example looks like a classic "trick" question
>for a first exam.  An APL student would be far more likely to blow
>it than a student of honest algebraic procedural languages (HAPLs).  
    
    All I can say is that I find it rather astonishing that you would
    consider a simple right-to-left rule harder to get right than a
    complex set of rules involving precedence and association.

>    Also,
>the trick of using division operations which require knowledge of whether
>that expression involves integers and/or floating point wouldn't be so
>funny if APL had more than one numeric data type, because then that student
>would be in twice as much trouble.
    
    But APL only has one numeric datatype!  You seem to be saying that
    it wouldn't be so easy in APL if APL were more like FORGOL!
    
    C'mon.  If you walk up to the average person on the street and ask
    them "What is the value of (1/10) * 10" do you think they are more
    likely to say 0 or 1, or perhaps one or more of the following:
    
    	1) "Well Al, is the computation supposed to be done using real
           numbers or integers?"
    
        2) 8.75 (PL/I is capable of giving you a number like this under
           certain circumstances due to various typing rules.)
    
    BTW, the APL answer would be 1, which is what *I'd* expect most
    non-FORGOL programmers to give as the answer.
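    
    (If you want to check the FORGOL side of that, here is a tiny C
    sketch -- C chosen only because it is handy -- with both of the
    obvious typings:)
    
        #include <stdio.h>
    
        int main(void)
        {
            printf("%d\n", (1 / 10) * 10);     /* integer arithmetic: prints 0 */
            printf("%g\n", (1.0 / 10) * 10);   /* floating point:     prints 1 */
            return 0;
        }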

>I'd be surprised if APL programs as a group have a lower incidence of
>missing or misplaced parenthesis errors than HAPL programs, at least
>upon the first attempt at execution.  Especially since individual APL
>expressions will be longer, and will require more parenthesis on the
>    average.
    
    Well then, you would find MY experience with APL programs surprising.
    The right-to-left rule may be simple, but it's always in mind when
    writing APL expressions because you also compose them that way.  
    Thus such mistakes are very rare.
    
    On the other hand, omitted parenthesization is a fairly common source
    of errors because the rules there are so complex and there are
    so many pitfalls (precedence, associativity, order-of-evaluation,
    datatyping, etc.).


>I think it is quite believable that many claims that APL is unreadable are
>because it flies in the face of conventions devised over centuries. 
    
    Ken Iverson once explained the resistance to
    the new ideas represented in APL with a memorable quote:
    
    	"People will use tomorrow what they used today, because that's
         what they used yesterday."

>    Some of
>those notations may be arbitrary, but many have a rationale behind them at
>least as compelling as anything I've heard about APL.
    
    Such as...???
    
    The rationale behind the APL rule is elegant simplicity.  Easy to
    understand, easy to remember, easy to apply.
    
    I believe the rationale behind operator precedence is to avoid having
    to write parentheses in some circumstances.  Phooey.
    
    I believe the rationale behind associativity is arbitrary.  It serves
    only to disambiguate expressions without parentheses in the absence
    of a general ordering rule.
    
    There is no consistent rule for order of evaluation.  If the function
    F returns an integer 1 larger than it returned the last time and
    starts at 1, the value of F() + (some expression) + F() varies widely 
    in FORGOL languages, and in many cases is "undefined" (a nice word
    for being "ambiguous").
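    
    A small C sketch (the function name is made up) of a case where the
    unspecified order of operand evaluation really does change the answer:
    
        #include <stdio.h>
    
        static int n = 0;
    
        /* returns 1, 2, 3, ... on successive calls */
        static int f(void)
        {
            return ++n;
        }
    
        int main(void)
        {
            /* C does not say which call is evaluated first,
               so this may legitimately print -1 or 1         */
            printf("%d\n", f() - f());
            return 0;
        }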
    
>The particular
>choices that Iverson made in pursuit of a superior notation were not the
>only ones that could have been made.  Considering how seldom people have
>addressed themselves to the same choices, it does not seem justified to
>hold APL up as the only correct way to do it merely because it is different.
    
    Would you agree then that it does not seem justified
    to hold APL up as being incorrect merely because it is different?
    That's what many people who say APL is unreadable seem to be saying.
    
    I don't recall saying that APL was the "only correct way", nor
    justifying APL "merely because it is different."  Could you provide
    some references?

      	db
103.39On the use or abuse of symbolsDREGS::BLICKSTEINDaveWed Oct 15 1986 18:039
re: .31
    
>    I'm trying to say perhaps a language should look like what it means,
>    rather than the programmer having to understand what the symbols mean.
    
    Obviously a COBOL fan.    :-)
    
    	db

103.40APL, PROLOG: mathematical?MLOKAI::MACKa(2bThu Oct 16 1986 00:2739
    Someone correct me if I have the wrong impression, but from the samples
    of code in both APL and PROLOG I have seen, they both look very much
    like mathematical notations of some sort. Except for assignment
    statements (which can be kept arbitrarily trivial), typical samples of
    the other languages don't. 

    This impression sets me against both APL and PROLOG from the start. I
    am concerned that any language which uses a form of mathematics as its
    model won't be side-by-side comparable with the real-world problem you
    are trying to model, since mathematical methods tend to postulate
    arbitrary but useful abstractions which disguise the real-world
    objects.  This makes sanity checking against intuition impossible, and
    sanity checking is too good a tool to give up. 

    As an example of this characteristic of mathematics:
    
    In the middle of a pile of eigenvectors which supposedly represent the
    behavior of a train running down a track, I can't check my work at each
    step by saying "Now, does this make intuitive sense, or did I make a
    mistake somewhere?"  Eventually, I come up with a final value that
    tells me that the train is travelling at 700 mph.  Then I know that I
    made a mistake somewhere in the middle of my calculations and have to
    backtrack.  I prefer an approach that lets me weigh the results at each
    step, but in some engineering problems, this kind of math is the "only
    game in town".  At least in software I have a choice.
    
    The questions "What does this really mean?  What does that really
    mean?" seem to infuriate math-oriented people.  Asking them is a
    guarantee of being treated like an obstinate child.
    
    The end result of these observations is that I don't trust human reason
    much, I find math a poor tool because I can't check my reasoning at
    arbitrary points by comparison to common sense, and so I try to stay as
    clear of math as I can get.  How possible is it to do this in APL or
    PROLOG?  On the broader sweep, what are my chances of staying clear of
    it over the next 30 years or so and still being on the cutting edge of
    software engineering? 
    
    							Ralph 
103.41Don't know COBOLNOGOV::GOODWINPancake seated; Tree watchingThu Oct 16 1986 04:4935
    Re: .39
    
    Sorry! I don't know any COBOL! :-) I think I was trying to say:
    
    A:=1;
    
    is clearer than
    
    1 A !
    
    unless, of course, you know FORTH. If you do know FORTH, 'A' could
    be ambiguous - does it mean address of variable A, hex number A,
    or a constant value A? To work that out you would need
    
    0 VARIABLE A
    
    before to clarify it. Just to add to confusion, it's
    
    VARIABLE A
    
    in the FORTH-83 standard.
    
    Another point about FORTH is that it uses Reverse Polish Notation,
    so, in algebraic terms
    
    4 * 4 + 5 * 6
    
    becomes
    
    4 4 * 5 6 * +
    
    so, like APL, it has very simple rules dispensing with parentheses
    and operator precedence.
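    
    To see why no precedence table is needed, here is a minimal C sketch
    of an RPN evaluator for single-digit operands (purely illustrative --
    real FORTH works quite differently under the hood):
    
        #include <stdio.h>
        #include <ctype.h>
    
        /* every operator just pops two values and pushes one;
           no precedence or associativity rules are involved    */
        static double stack[64];
        static int top = 0;
    
        static void   push(double v) { stack[top++] = v; }
        static double pop(void)      { return stack[--top]; }
    
        int main(void)
        {
            const char *expr = "4 4 * 5 6 * +";   /* i.e. 4*4 + 5*6 */
            const char *p;
            double a, b;
    
            for (p = expr; *p != '\0'; p++) {
                if (isdigit((unsigned char)*p)) {
                    push(*p - '0');
                } else if (*p == '+' || *p == '*') {
                    b = pop();
                    a = pop();
                    push(*p == '+' ? a + b : a * b);
                }
            }
            printf("%s = %g\n", expr, pop());     /* prints 46 */
            return 0;
        }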
    
    Pete.
103.42Symbols aren't evilDREGS::BLICKSTEINDaveThu Oct 16 1986 10:3512
    re: .41
    
    I was kidding.  

    COBOL tried to replace the use of symbols with English in order to make
    programs more understandable, but I think that the majority opinion
    is that the excess verbosity makes COBOL hard to read.
    
    I guess if there's a point in my joke it's that one lesson of COBOL
    is that the use of symbols can make a notation EASIER to read.
    
    	db
103.43reply to .40DREGS::BLICKSTEINDaveThu Oct 16 1986 10:4721
    re: .40
    
    That's a very good observation, and my experience in APL tells me
    that you are quite right that the lack of intermediate steps in
    APL sometimes makes it harder to figure out where the problem is.
    
    It's also compounded in APL by the fact that APL is so lax in its
    syntax and the domain of its functions that often the calculation
    is allowed to continue way past the point of error before something
    happens that makes APL complain.
    
    This is why the standard APL coding method is to construct expressions
    a little bit at a time, verifying the results at each "intermediate"
    step.
    
    APL is a mathematically oriented notation, and thus
    is most appropriate for problems that can be efficiently modelled 
    mathematically.  I personally think that is a very large domain of 
    problems.
    
    	db
103.44CIM::JONANMe/Gauss = Cow/CalculusThu Oct 16 1986 12:56100
    Re: AHM - right associative booleans.

    p AND q AND r = p AND (q AND r).  :-)

    Seriously, I believe p OR q AND r = p OR (q AND r) is true in MODULA-2.
    I think Wirth claims this is the "usual" interpretation.  But, I could be
    wrong about this.


    Re: .40           -< APL, PROLOG: mathematical? >-


>    This impression sets me against both APL and PROLOG from the start. I
>    am concerned that any language which uses a form of mathematics as its
>    model won't be side-by-side comparable with the real-world problem you
>    are trying to model, since mathematical methods tend to postulate
>    arbitrary but useful abstractions which disguise the real-world
>    objects.  This makes sanity checking against intuition impossible, and
>    sanity checking is too good a tool to give up.

    Abstractions yes; arbitrary? - almost never.  I'm not sure what you
    mean by "side-by-side" here.  Clearly, mathematical descriptions have been
    *extraordinarily* comparable to "real-world" problems - just ask any
    physicist (,chemist, geneticist, computer scientist, and even many
    social scientists).  Any abstraction process gains in generality what it
    loses in specifics - the purpose and result of which is not to "disguise"
    but to gain insight and *increase one's intuitive capabilities*.  If
    your intuition isn't "up to" the problem domain it isn't going to do
    you much good.  In fact, it will probably be an impediment.  My favorite
    example of this is Special Relativity.  Unless he's a genius, the
    intuitions of the uninitiated in this area will only serve to confuse
    him.  You just don't get a good *intuitive* understanding of what it's
    all about until you work through the mathematics (maybe even several
    times...).  Once you do, however, the concepts really do become
    "intuitive" (maybe not obvious, but...).

    
>    In the middle of a pile of eigenvectors which supposedly represent the
>    behavior of a train running down a track, I can't check my work at each
>    step by saying "Now, does this make intuitive sense, or did I make a
>    mistake somewhere?"  Eventually, I come up with a final value that
>    tells me that the train is travelling at 700 mph.  Then I know that I
>    made a mistake somewhere in the middle of my calculations and have to
>    backtrack.

    You're describing a problem with the *application* of a mathematical
    model not with the model.  All this means is that you are not a very good
    "cookbook" scientist (perhaps not a very good mathematician either,
    but, in this case, that's beside the point).
    

>    The questions "What does this really mean?  What does that really
>    mean?" seem to infuriate math-oriented people.  Asking them is a
>    guarantee of being treated like an obstinate child.


    I can't speak for others, but *I* don't get infuriated.  Frustrated,
    yes, at times, but *never* infuriated :-).  Of course, if the person
    is just reciting "mathematical recipes", perhaps he doesn't know what
    it "really" means either....
    

>    ... I find math a poor tool because I can't check my reasoning at
>    arbitrary points by comparison to common sense, and so I try to stay as
>    clear of math as I can get.

    As I alluded earlier, mathematics helps you in at least three ways with
    respect to intuition/common sense.  First, if you're dealing with an area
    you don't have much insight in, you can maybe just use the work done by
    others to achieve your goals (without understanding it).  This is a
    very useful, tried and true technique in engineering.  Second, it can
    act as a check on your intuitions on subtle points.  Third, and most
    important, you will increase/hone your intuitions in a given area if you
    assimilate the relevant mathematics.
    

>    ...  what are my chances of staying clear of
>    it over the next 30 years or so and still being on the cutting edge of
>    software engineering?      

    Not good.

      
    Re: .43

    Perhaps this "step skipping" look is what people are referring to when
    they say that APL is unreadable.  I don't know a great deal about APL,
    but perhaps APL programs should mimic the presentation style of
    mathematical works.  Generally, this follows the "formula" of a good
    deal of text (comments...), augmented with pictures (graphical comments?)
    followed by and interspersed with lines of relevant notations.  The
    notion that mathematical works are just volumes of symbolic notation
    is a false one (though there *are* exceptions: Russell and Whitehead's
    Principia for example).  Discussions, proofs, theorems, etc, are always
    presented in (natural) language with notation being used to present
    and make clear concepts and whatnot that would otherwise be too verbose
    to express.

    /Jon
103.45LOGIC::VANTREECKThu Oct 16 1986 13:3029
    re: .40
    
    You are right to some extent about the mathematical nature of some
    languages making it more difficult to sometimes do real world things.
    This is due to the fact that mathematically oriented languages tend
    to be more declarative than procedural. Thus, it becomes necessary
    to corrupt the mathematical purity of the language by adding procedural
    semantics in order to get real work done.
    
    If you want to check against reality (some arbitrary conditions) while
    doing some vector calculation in PROLOG, it is quite easy (it's been
    sufficiently corrupted). For example, suppose "scalar_product" returns
    all vector elements , _z, where each element of _z is the product of a
    value, _x, and the vector elements, _y, only if _y has no zero
    elements. If any element is equal to 0, then don't return any values of
    _z. In PROLOG, a procedure that computes scalar products with the above
    restrictions can be expressed as: 
    
    	scalar_product(_x, _y, _z) :-
    	   _z is _x * _y,
    	   ( (_z = 0, !, retract(sp(_))) ; (assert(sp(_z))) ), fail.
    	scalar_product(_x, _y, _z) :- sp(_z).
    
    The "!" , "retract", "assert", and "fail" are procedural hacks. Sample
    code that calls scalar_product, returning all vector elements, _y: 
    
    	... a(_x), scalar_product(10, _x, _y), ...
    
    -George
103.46Let's start another train of thought...MINAR::BISHOPThu Oct 16 1986 17:1614
    Re .45:
    
    Other things being equal (which they usually are not), a declarative
    language is easier to specify the semantics of, easier to analyze
    for potential parallelism, easier to generate code for, and easier
    to prove correct (if that should happen to be important for you).
    
    This is because a declarative language is freer of side-effects.

    There are side-effect free procedural languages (SISAL is one),
    and they share the benefits above.  Does anyone have things to say
    about such languages?
    
    				-John Bishop
103.47Try a language with .IMP.NOBUGS::AMARTINAlan H. MartinThu Oct 16 1986 17:3230
Re .44:

>    Seriously, I believe p OR q AND r = p OR (q AND r) is true in MODULA-2.
>    I think Wirth claims this is the "usual" interpretation.  But, I could be
>    wrong about this.

Since AND traditionally has a higher precedence than OR, it is not
surprising that "p OR q AND r = p OR (q AND r)".  While Wirth blew the
relative precedence of arithmetic, relational and boolean operators in
Pascal, at least he got AND and OR right between themselves.

The right vs. left vs. non associative nature of an operator is only
relevant when comparing it with another operator of the same precedence.
And if the order of evaluation of an operator's operands is not defined,
it only makes a big difference in the semantics if one of the operators is
not commutative.  (I'm ignoring screws like the potential for rounding
error in 1.0E30+1.0-1.0E30, and functions with side-effects).
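
To make that rounding-error aside concrete, here is a small C sketch
(assuming ordinary double-precision floating point) showing that
floating-point addition is not associative:

    #include <stdio.h>

    int main(void)
    {
        double big = 1.0e30;

        /* grouping changes the answer: 1.0 is lost if it is added
           to 1.0e30 before the two large terms cancel              */
        printf("%g\n", (big + -big) + 1.0);   /* prints 1 */
        printf("%g\n", big + (-big + 1.0));   /* prints 0 */
        return 0;
    }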


Oh, yeah:

Re .35

>    Note that in a strictly mathematical setting, [a * b / c] would
>    *always* be considered ambiguous (and very poor practice).

Nah.  Give two different mathematical interpretations (parses) of a*b/c.
I would believe you if you said a/b*c.
				/AHM/THX
P. S.  Extra credit: give an example of a non-associative operator.
103.48Clarification: play-by-playMLOKAI::MACKa(2bThu Oct 16 1986 18:2036
    Re .44:

    Perhaps I can clarify "play-by-play" with a little personal history--
    
    I went to UConn and got an engineering degree hoping to really
    understand electronics.  (My CS degree was in the EE department -- I
    graduated three courses short of a dual EECS degree.)
    
    I felt good about Calculus: derivatives were used to quantify change of
    various kinds and integrals were used to "build rocks from dust". But
    my heart dropped when they introduced me to Laplace Transforms. My
    transistor radio and I both live entirely in the time domain. But the
    engineering use of the Laplace transform manufactures a purely
    hypothetical "frequency domain" and then does all the calculations in
    it. 
    
    The result: my calculations, while they are very useful in designing my
    transistor radio, can't give me a cycle-by-cycle picture of the
    behavior of the electrical field inside my transistor radio, and this
    is what I really wanted to know.  I ran into the same frustration
    two years later in antenna theory.
    
    Just recently, I got a book by Robert Glorioso and somebody else,
    called _Engineering Intelligent Systems_.  It describes the design of
    intelligent systems using stochastic automata theory.  With a little
    work, I could become quite facile in working the vector equations, but
    reading his math does very little in helping me to picture and thereby
    understand the state-by-state changes in the system.  
    
    When I debug a program, I am very much dependent upon this kind of
    "heart-knowledge" to help me evaluate my results.  However methodical I
    become, all methods have holes, and all human performance of methods is
    inconsistent, and this "heart-knowledge" is the surest way of filling
    in the gaps. 
    
    								Ralph
103.49Nonassociative? No sweat.DSSDEV::ALDENKen AldenThu Oct 16 1986 21:1710
    re .47
    
    Of course, division isn't associative:
    
            (3/5)/5  <>  3/(5/5)   with real division.
    
    If you mean commutative but not associative, how about distance
    on the real line; i.e., f(a,b) = |a-b| .  An example could be
    
            | |3-5| - 7 |  <>  | 3 - |5-7| |   .
103.50This note needs no titleCIM::JONANMe/Gauss = Cow/CalculusFri Oct 17 1986 00:3324
    Re: .47

> Nah.  Give two different mathematical interpretations (parses) of a*b/c.
> I would believe you it you said a/b*c.

    Oops, you're right.  Wouldn't ya know, of all the obvious 
    possibilities, I write down about the only one that *doesn't* exemplify
    what I meant.  Typical...

> P. S.  Extra credit: give an example of a non-associative operator.

    Three obvious ones:

    1. Division operator
    2. Exponentiation operator
    3. Cross Product

    Another for the fun of it: Define ab, a & b integers, as 2a + 3b
    There are many others.  Algebras involving non-associative operators
    are very peculiar (and very hard to work in).

/Jon
    
    
103.51QUARK::LIONELReality is frequently inaccurateFri Oct 17 1986 10:3913
    Re: "associative operator"
    
    Funny, with all the discussion of left association and right
    association in past replies, why does everyone start thinking
    about associative operations?  I'd say that a non-associative
    operator is unary minus, but what do I know?
    					Steve
    
    P.S.  I always liked SNOBOL4 for your ability to bind functions
    to operators as you choose - there were a bunch of operators to
    choose from, some normally unused, with different precedence
    and association properties.
    
103.52Going the wrong way on a one way streetDREGS::BLICKSTEINDaveFri Oct 17 1986 10:5344

>    Perhaps this "step skipping" look, is what people are refering to when
>    they say that APL is unreadable.  

    Sorry, I accept the argument that it makes it harder to debug but
    see little evidence that it makes it less readable.  In fact, this
    is why I believe it is more readable.
    
    Which is more immediately understandable:
    
    DESCRIPTION 1
    
    Prologue: SUM and I represent integer values.
    
    Step 1.  Set SUM to 0.
    
    Step 2.  Set I to 0.
    
    Step 3.  Set I to I+1
    
    Step 4.  Add array element I to SUM.
    
    Step 5.  If I is not equal to the size of the array, go back to step
             3.
    
    Step 6.  Divide SUM by the size of the array.
    
    DESCRIPTION 2
    
    Step 1.  Sum an array
    
    Step 2.  Divide the sum by the size of the array.

    I think the second description is much more easily recognized as being
    a computation of an average than the first, even though the first provides
    more intermediate steps.
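    
    For comparison, here is roughly what Description 1 looks like when
    spelled out in a FORGOL language -- a C sketch with made-up names:
    
        #include <stdio.h>
    
        /* average of an array, Description 1 style: explicit steps */
        double average(const double a[], int size)
        {
            double sum = 0.0;
            int i;
    
            for (i = 0; i < size; i++)
                sum = sum + a[i];
    
            return sum / size;
        }
    
        int main(void)
        {
            double data[] = { 1.0, 2.0, 3.0, 4.0 };
    
            printf("%g\n", average(data, 4));   /* prints 2.5 */
            return 0;
        }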
    
    I guess you COULD say, that step skipping hides complexity, and
    there are circumstances where that complexity must be understood.
    But I think you'd be on the wrong track attempting to criticize
    APL for readability because it hides irrelevant complexity.
    
    	db
103.53My 2 cents on APLQUARK::LIONELReality is frequently inaccurateFri Oct 17 1986 12:1214
    I like APL, and always have.  The reason many people think APL
    is unreadable is because it tends to encourage APL Wizards to cram
    as much onto one line as they can.  The result is something that
    may indeed work, but is definitely unreadable except by other
    wizards.
    
    You can write very readable APL just as easily as writing
    unreadable FORTRAN.  But few APL programmers bother to do so (and
    even fewer bother with comments - knowing that comments sometimes
    slow down the program).
    
    To me, APL is "just" a top-notch 3rd generation language - one that
    has very simple, elegant and powerful constructs.
    					Steve
103.54LOGIC::VANTREECKFri Oct 17 1986 12:266
    RE: .53
    
    I have a younger brother that's an APL wizard. He has trouble
    reading his own 2 week old code....
    
    -George
103.55LOGIC::VANTREECKFri Oct 17 1986 12:3715
    re: .46
    
    >There are side-effect free procedural languages (SISAL is one),
    >and they share the benefits above.  Does anyone have things to say
    >about such languages?
    
    Do you consider destructive assignment a side-effect? If it's not a
    side-effect, then it's another feature of a language that makes
    parallel execution more difficult. For example, several VAX CPUs sharing
    global memory, trying to assign to the same variable at the same time,
    may cause undesirable results. Destructive assignment also makes doing
    proofs of correctness a little more difficult. Perhaps lack of
    destructive assignment is another attribute of a high-level language?
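    
    A C sketch of the hazard (using POSIX threads purely for illustration;
    the names are made up): two threads doing unsynchronized destructive
    assignment to the same variable usually lose updates, because
    "counter = counter + 1" is really a load, an add, and a store.
    
        #include <stdio.h>
        #include <pthread.h>
    
        static long counter = 0;
    
        static void *bump(void *arg)
        {
            long i;
            (void)arg;
            for (i = 0; i < 1000000; i++)
                counter = counter + 1;    /* unsynchronized read-modify-write */
            return NULL;
        }
    
        int main(void)
        {
            pthread_t t1, t2;
    
            pthread_create(&t1, NULL, bump, NULL);
            pthread_create(&t2, NULL, bump, NULL);
            pthread_join(t1, NULL);
            pthread_join(t2, NULL);
    
            printf("counter = %ld (expected 2000000)\n", counter);
            return 0;
        }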
    
    -George
103.56DREGS::BLICKSTEINDaveFri Oct 17 1986 13:0623
re: < Note 103.53 by QUARK::LIONEL "Reality is frequently inaccurate" >

>    You can write very readable APL just as easily as writing
>    unreadable FORTRAN.  But few APL programmers bother to do so (and
>    even fewer bother with comments - knowing that comments sometimes
>    slow down the program).
    
    Most of the time when you hear this comment, it is based on one's
    experience reading throw-away APL programs done by college students.
    
    I've worked as a consultant with a specialty in APL, and it has
    been my uniform experience that shops that use APL have very high
    standards regarding commenting practices.

    You may also see more hard-to-read APL programs because coding in
    APL is so efficient (in terms of programmer time) that there are
    throw-away programs that are easily whipped up in APL but that wouldn't
    be worth the time required to write them in FORGOL.  I.e., there is a
    higher percentage of throw-away APL programs than FORGOL ones because
    FORGOL requires a larger investment of time.   A sad contributing factor
    is that "throw-away" programs are often not thrown away.
    
    	db
103.57.51 came the closestNOBUGS::AMARTINAlan H. MartinFri Oct 17 1986 15:0946
Re .49,.50:

Sorry, no.  I meant non-associative in a syntactic, not algebraic, sense.
(See below).  I am sorry if I confused you.

Re .51:

You have the right definition of the word.  Unfortunately, for obscure or
evil reasons, unary minus is at least sometimes considered right
associative (consider the Digital "The C Language" poster).  For example,
"- - x" evaluates the right hand negation first. Now, in fact that would
occur without defining negation as right associative because of the need to
evaluate all operands of an operator before evaluating the operator on
them.  In the case of something like "++x--", it does allow you to assign
the parse "++(x--)" (which is semantically wrong because x-- isn't an
lvalue, but I'll ignore that).

However, in a language which had operators which operated on other
operators (functors(?)), I can imagine a unary functor FOO with the same
precedence as a unary operator on numbers named BAR.  In that case, it
would be important to know whether FOO BAR 1.0 meant FOO(BAR 1.0) or (FOO
BAR) 1.0.  That is, whether to operate on the operator BAR (or the
expression BAR 1.0) with FOO, or whether to apply BAR to 1.0 and apply FOO
to the result.

At least I think that is a valid rationalization for giving an
associativity for unary operators.  I could be wrong.


What I was thinking of is mentioned in some explanation of yacc.  Yacc
lets you designate operators as being non-associative.  The Ultrix manual
set description of yacc uses the relational operators as an example. 
Consider the very flat (and ambiguous) grammar:

E :== E op E | id
op :== + | * | < | AND | OR

Specifying that < is non-associative allows the parser to syntactically
disallow bogus expressions like "id < id < id" by saying that +, *, AND and
OR are left associative, and < is non-associative.  I realize that in a few
languages, this expression has a meaning, and in even fewer the meaning is
the intended one.  However, in some implementations of other languages, it
normally ends up being syntactically correct, and semantically incorrect. 

Sorry if this brain-teaser went off the deep end.
				/AHM
103.58Familiarity breeds respect...MLOKAI::MACKa(2bFri Oct 17 1986 15:4220
    In order to be fair in these discussions, I've installed APL 2.1 on one
    of our systems and am busy (outside of business hours) learning it. In
    scanning through the text I am using (_APL -- An Interactive Approach_
    by Gilman & Rose, 1975), it seems like APL deals only with pure numbers
    and (somewhat) strings.  Is this true?  What about abstract data
    objects and all that kind of stuff?  (Or is that kind of design/
    programming purely a concern made necessary by the "inadequacies" of
    FORGOL?) 

    Even at this brief reading, APL appears to be ideally suited for direct
    implementation of mathematical models, so it could legitimately qualify
    as a fourth-generation language for problems which can be easily
    represented mathematically.  But so far I wouldn't want to write, say,
    a large CAD system in it.  Perhaps I'll change my mind as I read
    further; we'll see. 
    
    Side question:  What is the best tutorial for learning APL as it is
    really used in the industry? 
    
    							Ralph
103.59A Snobol's chance in hell of parsing correctlyNOBUGS::AMARTINAlan H. MartinFri Oct 17 1986 17:1116
Re .51:

Oh yeah, I forgot to address your postscript about Snobol.

Randy Meyers was telling me yesterday that he had dealt with a Snobol
implementation which provided two user-defined operators (call them
~ and ^, who knows what they really were) which had the same precedence,
and opposite associativities (say, left and right, respectively).

Well, it seems you shouldn't be able to make ~ and ^ adjacent in
expressions, though the implementation let you.  You see, the expression
X ~ Y ^ Z should have NO legal parse, and A ^ B ~ C should have TWO
parses (be ambiguous).  As it turned out, the left associative operator
had a slightly higher precedence, so you actually could combine the
operators, but that was an aspect which was contrary to documentation.
				/AHM
103.60Data abstractionDREGS::BLICKSTEINDaveFri Oct 17 1986 17:1537
    re: .58
    
    APL does deal only with pure numbers and characters (in arrays
    or as scalars).
    
    You've got what is widely considered to be the best tutorial (Gilman
    & Rose), except that there is a newer edition that covers nested arrays.

>    (Or is that kind of design/
>    programming purely a concern made necessary by the "inadequacies" of
>    FORGOL?) 
    
    I hope your use of quotation marks isn't meant to imply that I have
    said or indicated that FORGOL languages are "inadequate".  I've
    never said, thought, nor implied that.
    
    However, rather than describe it as an "inadequacy", I would just
    say that abstract datatypes would be inappropriate in a language
    that doesn't really have datatypes to begin with.  To say APL is
    loosely typed is akin to saying the Lyndon Larouche is "a little
    crazy".
    
    If I was a radical APL fanatic (I'm not you know), I might say that
    this is like saying that APL is inadequate because it doesn't have
    a syntax for loops (APL doesn't have syntax, period.)  On the other
    hand I recognize the value of data abstraction and this is a valuable
    capability that you lose when you eliminate typing.
    However, until Ada, I don't think data abstraction has been attainable
    to any significant degree.
    
    I'd be interested to hear of some non-Ada examples of data abstraction.
    So far as I know, Ada is the only FORGOL language in wide use that allows
    you to both define abstract datatypes and operators to apply to
    them.  Others only allow you to attach a name to them and have some
    type-checking done automatically.
        
    	db
103.61recursive comments (i.e., comments on...)KALKIN::BUTENHOFApproachable SystemsSat Oct 18 1986 17:1848
        re .53 and .56 (APL comments)...
        
        My wife took an APL course in college.  The instructor
        subtracted points from the grade of each program handed in
        with "too many" comments (i.e., more than a one liner or
        so at the top of the program).  If this sort of attitude
        is at all common, it'd be no great wonder that well commented
        APL programs might be rare!
        
        The VMS timer module (VAX Macro), at least as of V2, had
        only one comment, in the module header: "Does anybody really
        know what time it is?"  UNIX code (C) is virtually
        uncommented... even the full documentation sets read as what
        most people would consider relatively terse code comments.
        
        TECO shares many of the problems of APL... a highly terse
        and (within its domain) highly powerful interpreted language
        which in the name of efficiency and pride-of-wizardry encourages
        uncommented and highly compressed code.  To the point where
        even a true expert can find it virtually impossible to unravel
        (even when said expert was in fact the recent author of the
        code).
        
        But then, one can do this with most any "FORGOL" language,
        too... are we criticizing languages or programmers, after all?
        Any language which provides a facility for comments gives all
        the tools necessary for writing programs which are highly
        legible to virtually anyone.  If nothing else, one can make the
        code be hieroglyphic interludes between the paragraphs of a
        detailed natural language design specification (e.g., Knuth's
        WEB).  The fact that few programmers (in any language) make
        full use of this capability can hardly be considered a fault
        of the language.  Note that even for interpreted languages
        (APL, TECO, DCL, etc.) it's quite reasonable to have a
        "compiler" which removes nonessentials such as spacing and
        comments before it's actually used, just as compiled languages
        have a source and an object format.  These "compilers" are
        quite common for TECO and DCL (in fact, VMS releases "compiled"
        versions of DCL programs such as VMSINSTAL), and I wouldn't
        be surprised to find them for APL as well.
        
        By the way, I object strongly to the term "FORGOL"... but
        only because it lumps ALGOL class languages (which I rather
        like) in with FORTRASH, which I hate.  While I dislike the
        term, the concept is quite valid.  And even the name (sigh)
        is rather catchy...  :-)
        
        	/dave
103.62FORTRASH vs ALGARBAGE ?CHOVAX::YOUNGDr. Memory...Sat Oct 18 1986 21:511
    (Please imagine a BIG smiley face here!)
103.63The term is meant to distinguish ALGOL from FORTRANDREGS::BLICKSTEINDaveMon Oct 20 1986 12:4117
    re: .30 and .61
    
    I hadn't planned on mentioning this, but since I've now received
    two comments about it:

    The term "FORGOL" was specifically chosen to indicate that two
    distinct sub-trees of the same overall family were being
    referred to using a derived name for the family.
    
    Otherwise, I might have called it FORCAL (FOR/PAS), etc., instead of
    a term intended to suggest FORTRAN/ALGOL, which is meant to cover
    languages that are related to FORTRAN or ALGOL.
    
    As I've said, the term isn't meant to be derogatory, although it
    is admittedly flawed in that most people interpret it that way.
    
    	db
103.64SW Development vs. Application DevelopmentSOFBAS::ROSCHWed Jun 03 1987 14:366
    Speaking of 'high-level' languages...
    How would you define the difference between a 'software development'
    language and an 'application development' language? And if you could,
    then which 'high level' software development language and which
    'high level' application development language do you favor the most?