
Conference turris::languages

Title:Languages
Notice:Speaking In Tongues
Moderator:TLE::TOKLAS::FELDMAN
Created:Sat Jan 25 1986
Last Modified:Wed May 21 1997
Last Successful Update:Fri Jun 06 1997
Number of topics:394
Total number of notes:2683

87.0. "Why do people like different languages?" by MLOKAI::MACK (It's the real world after all) Mon Apr 07 1986 11:01

    As this conference hasn't been too active lately, I'd like to generate
    some discussion between the "C people", the "ADA people", and whoever
    else wants to join, on why people prefer different programming styles.
    
    As an initial point of discussion:  What is a programming language?
    
    Traditionally, we have expected the language to do two things: 
    (1)	Be a shorthand for generating machine code that runs right.
    (2) Be a shorthand for communicating with other programmers what the
        machine code is supposed to do. 
    You "connive" with the compiler to get it to generate the machine
    code you want, "going over its head" whenever necessary.
    
    More recently, there has been talk about compilers as "virtual
    high-level machines".  This concept suggests that the language is
    truly a language to communicate with the compiler, who communicates
    with the actual computer system.  You tell the compiler precisely
    what you want the program to accomplish and how, and expect it to 
    communicate this accurately to the system.

    The two approaches have different problems.  With the first (C)
    approach, you expect to be dodging in and out of the compiler and going
    around it.  You have the responsibility of making sure all of these
    movements don't collide with each other.  With the second (Ada)
    approach, you have the responsibility of communicating everything
    you are trying to do to the language, so it won't go running off
    acting on incomplete information. 

    An Ada program is like a contract.  A C program is like a breadboard.
    I suspect that your language of choice will depend largely on whether
    you prefer the "crosstalk" problem of C or the "clear unambiguous
    communication" problem of Ada.  
    
    This model seems simplistic, but then architectural models always
    are.  Does somebody have a better or alternative model to suggest, or
    any extensions to this one?  I am trying to get a handle on why
    people prefer different language styles.

    						Ralph
87.1. "An Ada vote from a C programmer" by KATIE::COOK (Neil) Mon Apr 07 1986 19:28 (12 lines)
I am programming totally in C. A large proportion of my
programming errors are in parameter passing. They are
normally typing errors or arise from an assumption about
the way a parameter is passed. I.e. "This is an input-
only parameter therefore it must be passed using one
level of indirection, since it is a structure." Of
course, it inevitably is passed using two levels of
indirection because "of PL/I compatibility", "all the
similar routines do that", "it's more consistent" etc.

I believe in strong type checking in all large projects -
that is, more than one programmer!
87.2. "Are we victims of "TECHNO-SNOBBERY"?" by CSTVAX::MCLURE (David McLure) Mon Apr 07 1986 20:04 (30 lines)
    	I hate to say it, but there seems to be a certain element of
    snobbery surrounding the sorts of decisions one goes through in
    selecting a language.

    	Has anyone ever wondered why the thought of programming in certain
    languages tends to inspire great auras of wisdom and beauty, while
    programming in other languages inspires nothing more than yawns and
    snickers?

    	Maybe I'm off-base here, but I think I could find enough people
    who would sympathize with my tendency to cringe at the thought of having
    to write another line of code in Basic, while a program on the same
    machine, running identical algorithms, with the same I/O and CPU
    performance, and equal (if possible) quality documentation, written
    in Pascal (for example) would probably be considered the worthier program.

    	This doesn't necessarily correspond to the level of a language
    (i.e. high-level languages vs. low-level), because that's like trying
    to compare apples and oranges; issues such as performance vs. portability
    and maintenance immediately come to mind.  Instead, it seems that
    somewhere in the search for one's own programming niche, just enough
    ethnocentrism creeps into the thought processes to sufficiently
    prejudice one's view of other languages.


    					-DAV0

    p.s.  I see there is a (.1) reply to this note already; keep in mind I'm
	not pointing any fingers here, merely trying to stir up discussion.
87.3. "Commercial reasons" by SWIFT::BYRNE (We apologise for the inconvenience) Tue Apr 08 1986 04:28 (16 lines)
	There is a reply in another notesfile about the dying art of
	reading. It seems that .-1 didn't look at the existing replies
	before replying himself. That is not meant to detract from the
	content of the reply.

	The choice of programming language is usually made on a commercial
	basis. Consider a programmer who has spent all his spare time
	learning macro. A worthy deed but in a company where all coding
	is done in COBOL the programmer would probably be prevented from
	coding in his newly-learnt macro. The reasons for this are sound:
	nobody else in the company knows macro so nobody can maintain the
	code when the programmer in question is off sick, and when he
	leaves the company the employer will have to pay twice as much
	to hire a macro programmer.

	Program code is a large investment and as such should be protected.
87.4. "Pardonez-Moi!" by CSTVAX::MCLURE (David McLure) Tue Apr 08 1986 18:33 (43 lines)
>	There is a reply in another notesfile about the dying art of
>	reading. It seems that .-1 didn't look at the existing replies
>	before replying himself. That is not meant to detract from the
>	content of the reply.

		Speaking of the "dying art of reading", I did mention
	in my postscript that someone else had managed to slip a reply
	in before mine.  I'll admit that my replies have been a little
	long-winded, and I'll have to work on shortening them to respond faster.

		As far as whether the reply I made has anything to do with
	ADA or C, it doesn't specifically.  However, the original note did
	invite "..whoever else wants to join.." to discuss "..why people
	prefer different programming styles." and I was merely using the
	Basic vs. Pascal example to illustrate a point.

		If there is some reason why these replies should be limited
	to ADA vs. C replies, then by all means proceed.  Keep in mind, however,
	that not everyone interested in the general topic of "why do people
	like different languages?" is fortunate enough to be heavily involved
	in ADA or C coding at the moment.  As long as the comparison of ADA
	and C is kept to a conceptual level, I think that this note has some
	real potential for everyone involved.

		To respond further to .-1 concerning the choice of programming
	language being made strictly on a commercial basis of what the majority
	of the company uses, I think that is a realistic, but somewhat
	defeatist approach.  To explain why, I will use the analogy of the
	IBM sales pitch of "Why buy X (Digital for example) when you have so
	much already invested in IBM and the conversion would be too costly."
	
		In other words, the commercial aspect of choosing a language
	will always play a major role in the decision, just as economics play
	a major role in almost every decision we make, but let's assume for
	the purposes of discussion, that you are given all the money you need
	for software development on a given project (you just won Mega-Bucks)
	and you then need to decide on a language to use.  Once the survival
	level (speaking in terms of Maslow's pyramid) has been satisfied, then
	what are some of the hierarchical needs that must be satisfied in the
	quest for the ideal language?

						-DAV0
87.5. "Clear model is important" by TLE::BISHOPB Tue Apr 08 1986 22:02 (14 lines)
    I use BLISS at work, and enjoy its power, but when I want to write
    some little program I use Pascal.  In BLISS, when I take the time
    to do so, I can set up the equivalents of enumerated types and typed
    data structures, and there are things I can do that Pascal will
    not allow (valued IFs, for example, and untyped pointers).  In Pascal
    I don't have to do that set-up work, but I'm always going back up to
    the top of the program to add some new declarations.
    
    But in both languages the "virtual machine" of the language is clear
    to me (at least those parts I use), and the conventions I follow
    make the code comprehensible to me when I come back to it a week
    later.  This was not true of FORTRAN when I used it as part of a
    compiler project in graduate school.  It is not true of APL or MACRO
    for me, either.
87.6. "Luddites?" by SWIFT::BYRNE (We apologise for the inconvenience) Wed Apr 09 1986 04:28 (11 lines)
	Sorry David, .3 wasn't meant to be a flame.

	The business of selecting a programming language often works on
	the Luddite principle, "We've always done it like this, why
	should we change now?". The original reasons for selecting the
	current language probably still hold true but the language used
	for the next project should be considered as part of the systems
	analysis project.

	Incidentally, how many people have actually written a program (in
	any language) from scratch, i.e. from an empty edit file?
87.7. "Why I like APL" by TLE::BLICKSTEIN (Dave) Wed Apr 09 1986 10:21 (40 lines)
    I like APL because the essence of the language is notation.  In
    APL one strives to describe what the answer is rather than the
    algorithm for computing it.  
    
    APL has certain limitations that prevent one from obtaining this goal
    in any ultimate form, however even in "current" APL what would be
    "flow" in most languages is substituted by notation in APL.  The
    nested array extensions allow APL to take a giant step towards the
    ultimate goal.  In fact, with derived functions and user-defined
    operators (which magically became well-defined with nested arrays)
    it will almost never be necessary to use anything more than simple
    linear flow (no loops) to write a clear APL program.
    
    I reject the criticism that APL is unreadable.  This typically comes
    from people who just haven't attempted to learn to read APL.  I
    put as much weight into this criticism as that same person might
    put into an English-speaking person saying that Hebrew is unreadable.
    In fact, I find APL more readable than most other languages because
    it condenses in one line what might take several pages of FORGOL
    (my term for all the algorithmic languages derived from FORTRAN
    (Ada, C, Algol, Pascal, PL/I, etc.)).   And as I've mentioned, it
    substitutes notation for flow.
    
    I disagree with John's claim that it is hard to produce a "virtual
    language" on top of APL.   To the contrary, I think it's much easier.
    This is because APL is (similar to LISP) almost a completely functional
    type language, and it has very little syntax.  There are basically
    values (constants, variables, etc.), primitive functions and
    operators.   In nested array implementations, one can write functions
    and operators that behave exactly as if they were part of the base
    language.  It would indeed be hard to make APL look like Pascal,
    but it would be easy to make APL look like a different functional
    language.
    
    Important point: I'm not saying that APL is suitable for everything
    (far from it), nor that APL is the "best" language (a pointless
    debate IMO).  I do think that people misunderstand it and because
    of that, often fail to see its strengths.
    
    	db
87.8. "De gustibus non disputandum" by TLE::BISHOPB Wed Apr 09 1986 11:51 (21 lines)
    I didn't mean to say "good = easy to make virtual languages".  I
    was trying to say "good for me = I understand the virtual machine
    offered by the language".
    
    I empathize with Dave's praise of APL.  I think that the future
    of computer programming lies more in the direction of APL than of
    Pascal, as I think that most programmers are going to be telling
    a system _what_ to do rather than _how_.  But I find it easier to
    visualize what's going on in BLISS, perhaps only because I have
    written so much BLISS and so little APL.  I have tried to understand
    APL, but find it takes a lot of explanation and working of examples
    to understand what that one short line means.  Dyadic transpose,
    for example, is still a mystery, even though I have used it
    successfully, as I have no mental model of what it does.
    
    Of course, the response is "APL is denser than FORGOL, and a FORGOL
    program that did the same amount of work would take just as much
    figuring out (or more)".  I'm afraid we are in violent agreement
    in this area, Dave, except for the matter of personal taste.

       				-John Bishop
87.9. "How virtual Bliss? Elegance?" by MLOKAI::MACK (It's the real world after all) Wed Apr 09 1986 13:26 (22 lines)
Re .8 et al:
    
    How much do you find yourselves thinking of performance and byte-level
    stuff on the target machine when you are coding in, say, Bliss vs.
    Pascal?  
    
    I find that in Ada it is easy to entirely forget about the target
    machine, while in Bliss, I am constantly thinking about DEBUG, words
    and longwords and addresses (all of which are really implementation
    details, divorced from the conceptual space of the problem), even when
    I am doing a fairly high-level application. 
    
    In this sense, I find C easier to use than Bliss, that I can forget
    about the VAX at least for a couple of moments in C, where I have to
    always keep it in mind for Bliss.  But in Ada, I can forget it almost
    entirely until I absolutely need to think about it.
    
    What about elegance?  How do people feel about that?  Important?
    Unimportant?  I would define Modula-2 and C as elegant and powerful,
    Ada and PL/I as inelegant but powerful.  Comments? 
    
    						Ralph
87.10. "Why would anyone want to use an HLL?" by TLE::BLICKSTEIN (Dave) Wed Apr 09 1986 17:46 (17 lines)
    Whenever I'm involved in a discussion of the various aversions people
    develop towards APL I'm inevitably reminded of a remark Ken Iverson
    (author of APL) made to me:
    
    We were discussing how to get people to ignore the various myths
    and legends of APL and get them to examine the language themselves.
    After the discussion, there was a pause and he said something that
    has stuck with me ever since.
    
    He said, "If you think it's hard trying to get people to put aside
    their pre-conceived notions about APL, you should'a been around
    IBM when I was working on my thesis and John Backus was down the
    hall working on IBM's first FORTRAN compiler.  He had a great deal
    of trouble convincing ANYONE of why they might want to use a high-level
    language translator."

	db
87.11. "I like smorgasbords" by DSSDEV::HEALY Wed Apr 09 1986 20:37 (19 lines)
    I have only been coding for a couple of years and already my greatest 
    quandary is that I like and dislike all of the languages I've used.  I 
    find that no matter what the language, there are always some features
    I'd like to include from some of the other languages I've used.  
    
    Now I almost don't care what I use so long as it gets the job done as 
    painlessly as possible. 
    
    The only language I revolt against is C.  I think it was a great language
    for its time and should have died shortly thereafter.  I can code
    in it and read it, but it is always an insulting experience.  I'd
    rather code in macro than code through the gobbledy-gook - just
    give me the entry points for the libraries and I'll do it myself,
    thanks.  Transportability didn't have to mean C.  Ditto for Unix.
    
    Joe H. (currently coding bliss)
    
    
    
87.12. "$.02" by AIAIO::REPSTAD Thu Apr 10 1986 11:45 (55 lines)
    
    Programmers have a tendency to be highly subjective when it comes
    to picking one language over another.  And most people like the 
    language they learned first the best. 
    
    My pet flame regarding languages is that the nature of the 
    application should drive the selection of the language.  Let's 
    face it, nobody would want to write a device driver in COBOL, nor
    would they write a business application in Macro. 
    
    All of the languages which have been created to date have specific
    areas in which they excel. I personally like fortran for doing
    quick little hacks. The language is terse and doesn't require 
    a ton of declarations up front. 
    
    As far as the business environment that does everything in COBOL
    because that's the only language their programmers know, I don't
    think that is all that representative. Most programmers know
    multiple languages and are fluent in more than one. 
    
    One of the things that grabs me about the new generation of 
    languages appearing is that they are all so similar in their
    capabilities ("the syntax has been changed to protect the innocent,
    and confuse the experienced"). 
    
    Let's face it, guys & gals: it all boils down to ones and zeros.
    The major difficulty I have with learning a new language is the
    syntax.  (Is there any real difference between a FORTRAN DO loop
    and a C for loop other than syntax?)
    
    The only thing that makes a real difference between languages is
    that somebody thought to include certain functions into their 
    new language, that the older languages don't have.  We programmers
    have had to write libraries of macro routines to be called to
    implement those desired functions that were not a part of the 
    language.
    
    I've recently gone through the trauma of learning Lisp. An interesting
    little language, but I bet some clever hacker could write a
    Lisp interpreter on the VAX using LIB$TPARSE and any combination
    of other languages.
    
    Does anybody agree with any of this? Does any of it make any sense?
    
    What we need are fewer languages, not more. Ada was the grand attempt
    by the military to create the "best, all purpose, standard language
    the whole world should use." It doesn't seem to have turned out
    that way.
    
    Oh well....enough of this drivel...replies, anyone?
    
    
    				--Tom
    
    
87.13. by TLE::BLICKSTEIN (Dave) Thu Apr 10 1986 16:26 (35 lines)
>    Lets face it guys & gals, it all boils down to ones and zeros.
>    The major difficulty I have with learning a new language is the
>    syntax. (is there any real difference in between a fortran do loop
>    and a c do loop other than syntax?)

    I agree with this only in reference to the various "FORGOL" languages,
    which in my opinion tend to be variations, one-pluses, combinations,
    and derivatives of the same general idea (i.e. "FORGOL").
    
    In a broader scope, I disagree with your statements.  I think that
    languages like LISP, APL, SETL, etc. cause (good) programmers to
    model the problem and the solution in different ways, whereas
    the "FORGOL" languages are almost interchangeable in terms of program
    design.
    
    I also long ago abandoned the idea of one language for everything.
    Specialized languages allow more efficient ways of specifying what
    is being done.  In this respect, I think Tony Hoare's statement about
    APL being a "mistake carried through to perfection" might apply
    more to Ada than APL, the "mistake" being the one-language-for-all idea.
    
    It is easier for me to understand a string-manipulation algorithm
    specified in SNOBOL than in PL/I, Ada, or Pascal (FORGOL languages
    with decent string data-types).  Similarly, it is easier for me
    to understand a matrix-oriented algorithm specified in APL.
    
    The counter-argument of having to know several languages is valid,
    but you have to consider all the various factors.  For example,
    you have to weigh the advantages of coding a string-manipulation
    package in SNOBOL (easier to understand, faster to implement, 
    more terse == less bugs,
    etc) against the disadvantage of requiring people to learn SNOBOL.
    You have to admit that there are cases that favor each side.
    
	db    
87.14. by BACH::SAVIGNANO (Stephen Savignano - Lisp Devo) Thu Apr 10 1986 16:30 (39 lines)
    I prefer to program in Lisp because:
    
    - it has very little syntax ()
        no worse than BEGIN, END, and liberal sprinklings of 
        the all-important ";"
    - easy to write quick little hacks
    - NO declarations are needed 
    	 (they let the compiler generate better code, but they're not
          required)
    - interpreted as well as compiled 
        makes debugging easier
    - VERY rich programming environment
       debugger, stepper, tracer, (original lang. sensitive editor)
    - extremely modular
    
      etc .....
    
    re .-1:
    
    Learning Lisp NEED NOT be a traumatic experience.  I have taught
    many people Lisp.  
    
    Although most find it a little strange at first, I find that the 
    people that have the most trouble are the ones that have LOTS of 
    experience with another language.  They tend to fight the language
    rather than learn it.
    
    
    -Steve
    
  
    Also, Lisp is easily extended.  The Lisp community is quick to 
    recognize the good features of other languages, and incorporate
    them into Lisp (sequence capability of APL, packages from Ada....).
    
    
    
  
    
87.15. by BACH::VANROGGEN Thu Apr 10 1986 16:47 (10 lines)
    Re: .14
    Actually, packages were in lisp long before Ada was thought of,
    and they mean different things anyway. Besides, Ada doesn't have
    a reader.
    
    Common Lisp did "steal" some sequence functions from APL, though
    it wasn't deliberate. I think the ideas filtered into various lisps
    over the years, and were better organized in Common Lisp.
    
    
87.16. "Ada is a trademark of the US Govt., AJPO" by TLE::BRETT Thu Apr 10 1986 22:55 (62 lines)

    
    I believe Dave's FORGOL is a gross underestimation of the fundamental
    differences between a language such as FORTRAN and one such as the
    Ada language.
    
    First though, I'd like to clear up a misunderstanding taking place
    here.  "Ada" is NOT the name of a language, it is a trademark which
    is being applied to a range of tools and activities related to software
    development, including a language, a toolset, an educational program,
    and various research programs such as STARS.   The two biggest s/w
    components after the language translator are the CAIS and the APSE,
    and in many ways these will probably end up dwarfing the translator,
    just as VMS dwarfs any one of our compilers.   Hence the phrase
    "the Ada language" means "the language of the Ada s/w development
    environment".
    
    The general difference between a language such as FORTRAN, LISP,
    or APL and the Ada language is its intended use.  The earlier three
    are basically implementation languages.  You express your design
    and specifications in some other form, and then hack away coding
    it in YFL (your favorite language).  Within an Ada system however
    you express much of your high level design in the Ada language,
    using the translator to check its consistency, before starting on
    the implementation.  This is one of the reasons that the cross-unit
    checks are so important, and why we worry so much about LARGE programs.
    
    APL is basically a single user/small project language, as are many of
    the other languages before the Ada language.  Can you imagine programs
    in APL 100,000 or 10,000,000 lines long?  For that matter, how come
    VAX APL isn't implemented in APL?
    
    As for the argument about cute little overstruck boxes to create
    a million different operators, hence making it more readable, I
    would suggest that those have contributed directly to APL's lack
    of widespread support - most people don't have the h/w to use it.
    
    Yes, each language has its place, but for the kind of s/w being
    developed within DIGITAL I would say that 90% of the time the Ada
    language would be the most appropriate language, 9% of the time BLISS,
    and 1% the rest.  In the short term the lack of user education and
    (for non-VAX) lack of good Ada translators is a problem, but I sure
    hope the status changes in the future!
    
    FORTRAN is here to stay for quite some time, for only two reasons. 
    (1) Customers can't afford to rewrite their s/w,
    (2) hence we, and other manufacturers, pour lots of energy into
    	making it our most optimizing compiler.
    I can't help but feel that, in the long run, we are doing a disservice
    to computing science in general with this approach, but then we
    really are in this for the $$$
    
    
    /Bevin
    
87.17. by TLE::BLICKSTEIN (Dave) Fri Apr 11 1986 09:59 (57 lines)
   Bevin,
    
    My lumping of Ada in with FORGOL is hinged mainly on the premise
    that one models problems (in terms of algorithms and general
    data representations) pretty much identically for all those languages.
    All you've said is that the specification of the model is more
    automated in Ada, and I won't (for the time being) disagree with
    that.

>    I believe Dave's FORGOL is a gross underestimation of the fundamental
>    differences between a language such as FORTRAN and one such as the
>    Ada language.

    If the Ada language isn't in the same species as the FORGOL
    languages, well, then it's in the same phylum, whereas I feel
    that non-procedural languages are in a different kingdom.
    
    > Can you imagine programs in APL 100,000 or 10,000,000 lines long?
    
    Can you imagine an Ada program 1,000,000,000 lines long?  Applying
    a published Pascal lines per APL lines figure to Ada, that would
    be the equivalent Ada program.  I suspect that Ada is significantly
    LESS terse than Pascal.
    
    >For that matter, how come VAX APL isn't implemented in APL?
    
    How come VAX Ada isn't implemented in VAX Ada? (You could have used
    the "breadboard" compiler.)
    
    (Begin non-serious section)
    
    Here is a VAX APL interpreter written in APL:
    
    		.BX
    
    Would you care to provide me with a VAX Ada interpreter written
    in Ada?
    
    (End non-serious section)
    
>    As for the argument about cute little overstruck boxes to create
>    a million different operators, hence making it more readable, I
>    would suggest that those have contributed directly to APL's lack
>    of widespread support - most people don't have the h/w to use it.

    Boy are you right.  An interesting side note.  For years, DEC APL
    sales suffered because DEC did not offer an APL terminal, whereas
    other vendors had APL options for their terminals.  A recent release
    of VAX APL goes beyond solving the problem of not having a terminal:
    
    	We went from not having a special APL terminal to not requiring
        one.
    
    With other vendors, you could not introduce APL into a shop without
    convincing them to replace (or upgrade) their terminals.  With VAX
    APL, any VT-220 compatible terminal will do.
        
    
	db
87.18. "what are languages good for?" by BACH::VANROGGEN Fri Apr 11 1986 10:29 (5 lines)
    I'd be interested to know what you think the 9% Bliss applications
    and the 1% "other" applications are. That's really the issue in
    this note, not matters of personal preferences, despite the title.
    			---Walter
     
87.19. "Opening another can of worms..." by AIAIO::REPSTAD Fri Apr 11 1986 12:07 (51 lines)
    
    I admit it. I'm biased towards some of the older languages (like
    Fortran & Macro). The only "new" language that I've really been
    impressed with is C.  And that's because it's so similar to 
    working with assembly language and a good macro library (C as a
    language is almost useless without a good set of libraries). 
    
    What I would really like to see is a "Universal Compiler", one that
    would allow a programmer to mix languages in a single program unit
    to take advantage of the best of both worlds. We come close to 
    this with being able to link object modules of different languages
    together. I believe that development of such a compiler would
    be better than trying to add features to a language that the
    syntax and structure of the language simply won't support
    in a reasonable manner. Such a compiler would greatly enhance the
    concept of using the "optimum language" for a particular application.
    
    I believe most applications could directly benefit from using
    multiple languages in the development. The use of a universal
    compiler would ease the problems associated with multiple languages
    being used in the development of an application. 
    
    What kinds of problems would arise if we actually had a universal
    compiler?  Would it be difficult for us humans to read the source
    code?  Would it be a boon or a boondoggle?  Food for thought.
    
    Eventually we'll all be out of a job anyway.  Users will be able
    to program the computer via direct thought translation (an MIT(?)
    experiment has already succeeded in the recognition of thought
    patterns by a computer!).  And we'll have taught computers to 
    program themselves, and to build better computers. 
    
    Random Thoughts:
    
    Is programming an art or a science? (I like to think of it as an
    artform).
    
    Is the ability to program a talent, or can anybody learn it?  (I'm
    not talking about the mickey-mouse programs you have to write in
    school, I'm talking about serious programming efforts.) 
    
    If you can teach anybody to program, would there be a significant
    difference between the programs written by the "talented" individual
    and those written by the person who memorized the syntax?
    
    
    				Philosophically Yours,
    
    				Tom Repstad
    
    
87.20. "Your thoughts are safe" by TLE::BISHOPB Fri Apr 11 1986 19:46 (16 lines)
    "Anyone can program" just like "anyone can write", or 
    "anyone can speak".
    
    There's a world of difference between the output of a trained
    person and your average Joe.  There is also a difference between
    a person with something to say and one whose mind is empty or
    whose thoughts are disordered (George Will is an example of a man
    who can write, but can't think).
    
    As for "thought programming": I doubt it.  Machine understanding
    of written language is in the experimental stage, and has been for
    a long time; understanding of speech is at the single-word stage;
    understanding of thoughts well enough to write code from them isn't
    going to happen for a long time.
    
    			-John Bishop
87.21. "Reading Minds..." by AIAIO::REPSTAD Fri Apr 11 1986 20:43 (19 lines)
    The research I was referring to was able to translate a vocabulary
    of about 20 words.  The translation was based on the "thought waves"
    generated by the brain when articulating the muscles required to
    say a particular word, whether the word was spoken aloud or not
    (or something to that effect).  The amazing part was that they
    took a group of people who spoke different languages and taught
    them to speak the English words.  The program had no difficulty
    distinguishing the words regardless of the accent of the speaker
    (thinker?) - I bet the guys doing speech recognition would like
    to be able to claim that!
    
    At any rate, I don't believe that thought recognition systems are
    all that far off. Perhaps in the next 10 to 20 years a "practical"
    system may evolve. I believe I originally read the article in Omni
    or a similar magazine. If I can dig it up I'll post it somewhere
    if anybody is interested in it.
    
    					--Tom
    
87.22A proposal to rebuild the Tower of Babble.CSTVAX::MCLUREDavid McLureSat Apr 12 1986 04:1783
	I agree with 87.19 on the concept of a "Universal Compiler".  I
    think that it could happen (and even be used - a miracle in itself)
    if there were an easy way to decide which languages would be used
    for particular functionality.  If all languages could be categorized into
    a standard set of functional domains, various language clusters could then
    be configured depending upon the needs of the application.

	To cluster languages, we could construct a matrix of languages vs.
    functionality.  If a list of different domains of functionality could
    first be agreed-upon, then it would be possible to map each language to
    the various domains of functionality.  The resulting matrix could be
    used in deciding what language cluster groups to use for a given project.
    
	To begin the process of creating different domains of functionality,
    I will list a few key areas which will hopefully not overlap too much.
    The only way this will work is if everyone contributes, so please take
    this list and add to (or subtract from) it.


	Domains of Programming Functionality
    ------------------------------------------------------
	(A)	Data-type structuring.
	(B)	Parallel processing.
	(C)	User-input handling.
	(D)	String manipulation.
	(E)	List processing.
	(F)	Database management.
	(G)	Mathematical calculation.
	(H)	Visual presentation.
	(I)	Networking/communications.
	(J)	Etc.

	We can rate the languages (say from 1 to 10) on each of the domains
    by using the matrix diagram shown below.  If you don't see your language(s)
    listed here, then feel free to add them (in alphabetical order).
   
    Note:  Don't rate anything until the above domain list has been established.

    				Functional Domains
    	   +===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+
Languages  | A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q |
===========+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+
ADA	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
ALGOL	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
APL	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
BASIC	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
BLISS	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
C	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
COBOL	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
DESIGN	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
FORTRAN	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
LISP	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
LOGO	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
MACRO-11   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
PASCAL	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
(etc.)	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
(etc.)	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
-----------+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+---+
(etc.)	   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |   |
===========+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+


	Like I said, the first task is to settle upon the different domains
    of functionality.  Once that is accomplished, then a matrix model such as
    this could be used to record average ratings given for each language by
    reader consensus.
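	As a concrete sketch of how such a matrix might be used once filled
    in (a Python illustration; the ratings below are invented placeholders,
    NOT the reader consensus this note asks for):

```python
# Sketch of the proposed language-vs-functionality matrix.  The ratings
# are invented placeholders on the suggested 1..10 scale; they only show
# how a language cluster could be chosen mechanically from the matrix.

RATINGS = {
    "APL":     {"math": 9, "strings": 3},
    "COBOL":   {"database": 8, "strings": 5},
    "FORTRAN": {"math": 8},
    "LISP":    {"lists": 9, "strings": 6, "math": 5},
    "SNOBOL":  {"strings": 9},
}

def cluster_for(needed_domains, k=1):
    """Return the k best-rated languages for each needed domain."""
    cluster = {}
    for domain in needed_domains:
        ranked = sorted(RATINGS,
                        key=lambda lang: RATINGS[lang].get(domain, 0),
                        reverse=True)
        cluster[domain] = ranked[:k]
    return cluster

# An application heavy on string handling and math might cluster as:
print(cluster_for(["strings", "math"]))
```

    An unrated domain simply scores zero, so a language never proposed
    for a domain can never be chosen for it.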

					-DAV0
87.23Ada and "We are the world" compilersDSSDEV::HEALYSat Apr 12 1986 15:3815
    I'd like to hear the 9%/1% answer myself, but I basically agree
    with Bevin that Ada should be the language of choice for development.
    
    It provides (or allows you to implement) most of the desirable features
    of the various "FOGOL" languages.  As for a "universal compiler" -
    I don't know...  
    
    I come from a software services background and such an item may be a 
    desirable commodity because customers are scared to death of
    multi-language projects.  All but the most sophisticated customers
    visibly twitch at the mere thought of getting involved with such
    an endeavor.
    
    jeh
    
87.24TLE::BRETTSat Apr 12 1986 18:0314
    
    9% BLISS for cases where the available Ada compiler is not yet capable
    of producing code that meets the requirements.  For example, fork-level
    stuff in VMS mustn't use registers other than R0..R5, PC, etc.
    At the same time the Ada compiler should be enhanced to meet these
    needs, and (in the future) OS's should be designed to not have them.
    
    1% MACRO when the exact machine instructions are important - eg:
    the save/load process context instructions.  Actually 1% is probably
    a GROSS overestimate of the amount of such stuff, and you could
    probably save developers time by going back to writing machine code
    and not have to develop an assembler.
    
    /Bevin
87.25SMOP::GLOSSOPKent GlossopSat Apr 12 1986 18:4978
I'm sorry to see Ada turning into a religion.  I thought that we were past
that point.  (Judging from this discussion, I was seriously mistaken.)  In
any case, I don't see Ada as the final word in programming languages any
more than PL/I or ALGOL-68 was.  Each (including Ada) was made of what was
considered to be "superior" technology of the day.  I expect to see a new
"3rd generation" (FORGOL :-) ) language in 10-15 years that will again be
"the superior language".  Technology (even software technology...)
is continuing to advance, and I have little doubt that down the road there
will be enough new ideas that Ada (environment and all) will be remembered
as another major, but not the last, language.  (I don't personally consider
Ada to be a leaps-and-bounds advance, merely an incremental refinement of
many ideas that have been around for a long time.  To be sure, it does
have a lot of useful features, particularly when it comes to building large
systems.  However, that doesn't mean that Ada is the last word.)

Languages are tools for implementing ideas, much as physical tools and raw
materials form the basis for building physical items.  We have specialized
our hand tools because it is useful to do so.  While we can continue to
try to build "the ultimate language", I don't think that it is possible to
succeed at such a task.  Why?

    o Different tasks can be done more efficiently with different tools.

    o People use what they've been taught, what they happen to take a
      fancy to, what the code that they are using frequently is already
      written in, or whatever language best meets the above criteria of
      the ones available to them.  (Note that DEC is very unusual in that
      people can simply say "I want to use x", and then go off and install
      it on their system.)  There is a lot of FORTRAN and COBOL code around
      (not to mention a lot of other languages), and there are a lot of
      people who know those languages and couldn't care less about Ada or C.

While I can envision many times when a language like Ada would be useful, I
can also think of times when APL or even FORTRAN would be far more appropriate
choices.  For example:

    o If I had an application that was a prime candidate for using
      packed decimal data (say a business application that did a lot
      of I/O and relatively little computation), I would not pick Ada
      or C as the implementation language.

    o If I had some array data that I wanted to experiment with
      various computations on, I would use APL or a spread-sheet.

    o If I wanted to write a quick program to be used once that did some
      straightforward mathematical computations, I would be inclined
      to use PL/I.  (This is a good example of where style and/or
      familiarity come into play.  Many people would use FORTRAN for
      this.  While I know FORTRAN, I prefer PL/I because it has all of
      the features of FORTRAN in this area and because I currently use
      it for my day-to-day work.)

    o If I were writing an application with a large amount of string
      and pattern processing, I would use SNOBOL, SCAN, or something
      similar.

There are numerous other cases, many of which are much more a matter
of taste.

As far as a "universal" compiler - you'll never see it.  The closest that
you'll probably ever see is a slightly better integrated version of the
VMS environment that allows multiple language objects to be linked together,
possibly with link-time type checking, although even that would be a minor
nightmare for anything but the most primitive data types.

----

As far as Ada vs. C, it would depend on the task.  For large tasks or ones
that were going to be around for a long time, I would use Ada.  For
bit-twiddling tasks, I would consider C.  In actual practice, however, I
would not pick either one because PL/I is essentially a requirement for
the work I am doing now, and the PL/I language has enough features that
the only other language I have found necessary on occasion was Macro, and
the cases when I needed it could not have been handled by C or Ada, despite
their language features.  (So much for the language to end all languages...)

Kent
(VAX PL/I)
87.26TLE::WINALSKIPaul S. WinalskiSat Apr 12 1986 19:5613
A semi-serious observation:

	A committee of language designers got together in the mid-60s and
	lumped all of the bad programming practices of that decade into one
	language.  The language was called PL/I.

	A committee of language designers got together in the mid-70s and
	lumped all of the bad programming practices of that decade into one
	language.  The language was called Ada.

PL/I, by the way, happens to be the language I most like programming in.

--PSW
87.27TLE::BRETTSun Apr 13 1986 00:2022
    
    In case you missed my rather long reply
    
    
		ADA IS NOT THE NAME OF A COMPUTER LANGUAGE
        
    		IT IS THE NAME OF A LARGE NUMBER OF EFFORTS AIMED
    		AT PROGRAMMING LARGE APPLICATIONS.
    
    		ONE OF THESE EFFORTS WAS THE DESIGN OF A LANGUAGE.
    
    
    Furthermore it is NOT true that the Ada language was designed by
    a committee.  It was designed by a team from CII Honeywell Bull
    and refined by a dictatorship.  Both the team and the dictatorship
    were headed by Jean Ichbiah.  A committee of extremely good people
    now forms the "Language Maintenance Committee", but it is NOT doing
    language design, merely clarifying various difficult but not
    critical issues.
    
                    
    /Bevin
87.28The Mr. Popeil approach to computer softwareTLE::BLICKSTEINDaveSun Apr 13 1986 10:3724
    Bevin, why don't you give us some nomenclature for Ada that includes
    an unambiguous term for what you'd expect to find in a file with the
    .ADA extension.
    
    My problem with Ada and your "90% statement" can be summarized as
    follows:  I reject the notion that doing everything with
    one tool is the best way to do things (certainly not for "90%"
    of all cases).  This is a bad idea ("carried through to perfection"
    in Ada), and it is my opinion (and to some extent, observation) that
    this is NOT the direction that software engineering will follow.
    
    I wouldn't want to limit myself to using only one "all-purpose"
    tool to build a house when I can use several special purpose tools
    better suited to individual tasks.   I have to stretch my imagination
    to understand why someone might claim that would even be an "advantage".

    Ada (referring to the "large number of efforts") has collected
    and refined some of the better ideas of the past, and introduced
    a few new good ones, but rather than collect all these things into
    one enormous complicated heap (the so-called "kitchen sink" approach),
    it would be better to distribute the good ideas from Ada back out
    to other languages where they are appropriate.

    	db
87.29(And now for another biased opinion...:-)BACH::VANROGGENSun Apr 13 1986 17:4724
    Actually, I agree with the notion that most programming tasks can
    and should be done with one language. I happen to think that language
    is Lisp, not Ada or PL/I, but which language isn't so important.
    What's important to the programmer is the richness and flexibility
    of the whole programming environment. When using one language, you
    can take advantage of uniform conventions and better integration
    between all the components of a system. These two reasons are why
    Ada is so comprehensive and why it's a big improvement over older
    languages like PL/I and Fortran.
    
    However, those advantages also apply to Lisp. The important difference
    here is that Lisp, unlike all the other languages I know of, has
    been able to incorporate the good ideas of other languages without
    losing many of its characteristics. This ability will give it a
    much longer useful lifespan, because it won't become obsolete or
    unfashionable. It's been used to write operating systems and to
    create new experimental languages. It usually provides a comprehensive
    programming environment which is easily tailorable. It's great for
    writing quickie little programs or for writing huge systems, since
    it provides lots of predefined capabilities and is adaptable in
    the way it's used.
    
    			---Walter
    
87.30What is Ada; "Tools"?LATOUR::AMARTINAlan H. MartinSun Apr 13 1986 21:3625
.27>ADA IS NOT THE NAME OF A COMPUTER LANGUAGE

Let's see what the Ada standard has to say about this.  From
ANSI/MIL-STD-1815A, page 1-1.

"
1.  Introduction

Ada is a programming language designed in accordance with requirements
defined by the United States Department of Defense: the so-called Steelman
Requirements.
"

Be careful.  When you open your door some dark night there could be
agents from the AJPO outside. . .


Re .28:

I believe the use of the word "tool" in discussing Bevin's percentages
may not aid the interchange of information.  It is hard to believe that
Bevin would have you use Ada instead of Yacc, MMS, or a spreadsheet
in the applications they were invented for.  He confined his answer to
comparison between existing Digital-supplied programming languages on VMS.
				/AHM
87.31TLE::BRETTSun Apr 13 1986 23:377
    
    Yes, it is intriguing that the usage of the word "Ada" has changed
    over the years.  Curiously enough, the usage in the ARM runs against
    the advised usage of trademarks, which basically holds that a trade
    mark is damaged by being used as a noun.
    
    /Bevin
87.32Oh! Pascal!CSTVAX::MCLUREDavid McLureMon Apr 14 1986 02:4962
	I have been growing steadily more interested in ADA as this note
    series progresses.  So much so, that today I went to a bookstore to
    see if I could find a good introductory book on it.  I expected to see
    the walls literally lined with books on ADA from the sounds of it in
    this note... as it turned out, I couldn't find one book on ADA in the
    entire store.

	To my surprise, the shelves were lined with my favorite language:
    Pascal.  I found every Pascal book that I own, plus many more.  There
    were also a lot of books on Turbo Pascal (which seems to be getting a
    big push these days for the IBM-PC).  In total I counted around 6 shelves
    of books on Pascal, 1 shelf on Fortran, 2 shelves on Basic, 1 shelf on
    PL/I, half a shelf on C, and half a shelf on Lisp and several other
    miscellaneous languages.

	While not a college bookstore (this was a bookstore in the Lechmere
    mall in Framingham), I did recognize a lot of my college textbooks among
    the collection, such as "Oh! Pascal!" by Cooper and Clancy; "Pascal With
    Style" by Ledgard, Nagin, and Hueras; "Introduction to Pascal" by Welsh
    and Elder; "Lisp" by Winston and Horn; "Data Structures and PL/I
    Programming" by Augenstein and Tenenbaum; and many other familiar
    non-language (text)books, so I would weigh this sampling as a fair
    representation.

	I think we have not given Pascal enough credit here as a language of
    choice among the "silent majority" of programmers.  It seems that most
    computer science textbooks use Pascal to explain algorithms and concepts
    and that most computer science curriculums include Pascal as the required
    high-level language (while Engineers are still being taught Fortran).  So,
    what ever happens to all those Pascal programmers?  Do they immediately
    switch to ADA once they leave school and work in the computer industry?
    I know I haven't, although I'll have to admit that I'm curious about ADA.

	Since I have been here at DEC, I have mainly used the Design language
    (of VAX/Producer), which is similar to Pascal.  I find that it is very
    good for string manipulation and list processing, so I use it for my main
    program to handle event queues and text output.  Because it is an
    interpreted language on the Pro-350, I increase the speed of graphics
    using Gidis files spawned to a MACRO-11 task (written by Fred
    Kleinsorge).  To allow interactive graphics and IVIS commands, I
    multi-task with a Basic program (which was originally written by Phil
    Perkins and later modified by yours truly) to make calls to the CORE
    graphics library and the Pro IVIS Driver.  To make matters more
    interesting, I've been creating Gidis files using a Pascal program
    because of its extensive data-type definition capabilities, which allow
    for easy 3-D viewpoint transformations, among other things.

	This is an example of a "cluster" of languages (which I mentioned in
    87.22).  I don't see anything too terrifying about this sort of
    arrangement.  It allows me to share existing tasks, as well as gain the
    functionality of other languages.  This configuration is mainly due to
    the fact that the current version of Design used in the field (v 1.5)
    doesn't allow all of these functionalities, but the new version (v 1.6)
    does.  Still, it allowed me to include all of the "new" functionality in
    a program which has to run on the existing (v 1.5) systems in the field.

	While I question the need for a "Universal Compiler" (since I had
    no problem compiling each of these programs with their own compilers), I
    still think it would be nice to have a "cluster matrix" (up-front in the
    planning phases) to determine what sort of language cluster to configure
    for a particular application.

						-DAV0
87.33PL/I&Pascal (1965)-> Mil Spec 1815a (1985) ->? (2005)TLE::BRETTMon Apr 14 1986 09:0454
    
    Just as, in twenty years' time, there will undoubtedly be better
    languages than the Ada language; so, today, the Ada language is
    significantly better for s/w engineering than the other general
    purpose languages (Pascal, Fortran, PL/I), simply because it
    incorporates many more years of experience in its design and wasn't
    intended to be a simple teaching language, unlike Pascal.
    
    The trouble is, then as now, there will be people who will say "why
    not just put all those groovy features in MY favorite language".
    
    The reasons are simple.
    
    (1) Languages should be designed as an integrated collection of
    features. For example, you have to put overloading, operator names,
    packages, and better name selection mechanisms into a language ALL
    AT ONCE to get the benefit of being able to give procedures/functions
    the same identifier.  Miss ANY of these, and the others don't hang
    together well.  As an example of doing the opposite, I would suggest
    FORTRAN 8X.
    
    (2) Earlier decisions in the language (eg: implicit declaration of
    variables in FORTRAN) may conflict with later concepts (eg: nested
    scopes).
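    Point (1) can be illustrated with a rough analogy outside Ada (a
    hedged Python sketch, not Ada itself): functools.singledispatch plays
    the role of overload resolution, and the module namespace plays the
    "package".  Remove either mechanism and two routines could not safely
    share one name.

```python
# Analogy only (Python, not Ada): singledispatch supplies "overloading"
# (the argument's type selects which body runs) while the enclosing
# module acts as the "package" that scopes the shared name.  Drop either
# feature and two routines could not both be called "image".
from functools import singledispatch

@singledispatch
def image(value):
    raise TypeError(f"no image() version for {type(value).__name__}")

@image.register
def _(value: int):            # the "integer overload"
    return str(value)

@image.register
def _(value: float):          # the "float overload"
    return f"approx {value}"

print(image(42), "/", image(2.5))
```

    The point of the analogy is exactly Bevin's: the dispatch mechanism,
    the shared name, and the scoping all have to exist together before
    the feature hangs together at all.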
                                                            
    I have no objection to specialised tools, Dave, but FORTRAN, PL/I,
    Pascal, and the Ada language are the same GENERAL PURPOSE tool.

    FORTRAN can, with strong justification, claim to be a special purpose
    high speed computing tool and as such has a strong claim to continued
    usage.  The other two will continue to live also, but the reasons
    will be largely historical, and (for Pascal in the educational environment)
    partly because of performance.
    
    
    Bookshops:  The one I frequent in the Mall of New Hampshire has
    a reasonable selection of Ada books.
    
    .ADA files: You will find Ada programs in .ADA files.
    
    
    I was intrigued by a recent discussion on the Ada Usenet dealing
    with teaching students programming.  One guy was claiming
    that the Ada language was difficult to teach; then another (both
    from universities, I believe) replied that he found that teaching
    the students good s/w engineering practices (modularising, abstraction,
    information hiding, etc.) and then using the Ada language to follow
    these practices led the students to become increasingly frustrated
    with the more traditional languages, because they didn't support
    such practices well.  Furthermore, this approach had the added
    advantage that it made the design of the Ada language apparent, so the
    impression that it's a collection of unrelated features is shown
    to be false.
    
    /Bevin
87.34B. Dalton's and Ada; Ada for examplesGALLO::AMARTINAlan H. MartinMon Apr 14 1986 10:0827
I am very surprised that the B. Dalton's in Framingham Mall didn't have
any Ada books available, because they always have before.  (The bookstore
in the Mall of New Hampshire is also a B. Dalton's, and both have
collections of about the same size: about 5 large racks, floor-to-ceiling.)
Maybe they were sold out (or at least they had not moved new copies
out from storage).  Or perhaps all the Ada books were in the rack marked
Software Engineering.  (Both stores are not above putting books with names
like "Pascal for the 8080" in the section marked "Microcomputers - 8080".)

If you really want to examine a few books on Ada, you are more likely
to see them by asking someone in that bookstore to check the stockroom
than by visiting any other name-brand bookstore chain in the suburbs.
Or check out the contents of a Digital site library before you buy.


It takes a special kind of commitment for an author to use Ada instead
of, say, Pascal, for examples of algorithms and the like.  Compare the
size of Bevin's universal sort routine in the Ada conference with a
J. Random algorithm book's example of Quicksort.  The former is ready
for use (I assume), whereas the latter will almost invariably need to have
declarations added or changed before it can be used to actually sort
something you would want to sort.

You can write tiny, special case examples in Ada as well as Pascal,
but you would not get all of the advantages of Ada; indeed, the program
might not be considered to be the ultimate expression of Ada style.
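The gap Alan describes can be sketched outside Ada (a Python analogy --
this is NOT Bevin's actual routine, and the generic version here is only
a stand-in for Ada generics): the textbook example hard-wires the element
type, while the reusable version parameterizes the ordering.

```python
# Python analogy for the "ready for use" vs "textbook" contrast.  The
# textbook quicksort assumes plain comparable numbers; the generic one
# takes the element ordering as a parameter, so the same routine sorts
# records, strings, etc. without editing its declarations.

def quicksort_textbook(a):
    """Typical algorithm-book version: one fixed element type."""
    if len(a) <= 1:
        return a
    pivot, rest = a[0], a[1:]
    return (quicksort_textbook([x for x in rest if x < pivot])
            + [pivot]
            + quicksort_textbook([x for x in rest if x >= pivot]))

def quicksort_generic(a, precedes):
    """'Generic' version: ordering supplied by the caller."""
    if len(a) <= 1:
        return a
    pivot, rest = a[0], a[1:]
    left = [x for x in rest if precedes(x, pivot)]
    right = [x for x in rest if not precedes(x, pivot)]
    return (quicksort_generic(left, precedes)
            + [pivot]
            + quicksort_generic(right, precedes))

print(quicksort_textbook([3, 1, 2]))
print(quicksort_generic(["Pascal", "Ada", "C"], lambda x, y: x < y))
```

The second routine is the one that is "ready for use" in Alan's sense:
nothing in it needs to change when the thing being sorted changes.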
				/AHM
87.35AIAIO::REPSTADMon Apr 14 1986 11:4269
    Let's face it, guys & gals: many if not most of us view languages
    in a manner similar to a religion.  Once you've been indoctrinated
    in a particular religion, you will by whatever means necessary try
    to rationalize your beliefs.  I would be very interested to read
    a psychological paper on the reasons behind our (i.e. S/W Engineers')
    ritual practice of endearing ourselves to a particular language.
    
    Some interesting topics in a paper would be:
    
    	1. "First Love": The undying devotion to the first language
                         one learns.
    
        2. "Revelation": The trauma and joy at the discovery of yet
                	 another language "better" than "First Love"
       			 (although first love is not forgotten, only
    			 used less frequently).
    
        3. "Rejection":  The practice of dismissing some new language
     			 without the benefit of trying to learn it 
    			 simply because it is "not my kind of language".
    
        4. "Maturity":   The recognition that all languages have merit,
    			 and that it doesn't matter what you program
                   	 in, the end result will be the same. 
    
        5. "Grace":      (Grace= A gift given by God which is undeserved
      			 and/or unearned). 
    
    			 The realization that arguments/discussions
    			 about the "Preferred" language are ephemeral.
    			 Time will change everything.
    
    Personal Note:
    
    	Hold not too dearly to your beliefs about programming languages,
    	For time shall surely rip them asunder.
    
                                          
    I realize that I may be a novice at this programming stuff compared
    to some of the people in this notes file (I've only been at it for
    around 10 years now).  But I have difficulty believing that any
    language is superior in general to any other.  Languages definitely
    have qualities which make them superior in certain areas, but not
    as a whole.
    
    Regarding the reply that stated we would never see a universal
    compiler, I beg to differ.  I think the technology of software
    engineering will eventually lead to the necessity of such a compiler.
    I do not believe it will be "structured" as in the matrix in reply
    .22(?), but rather it will be free form.  You will be able to
    mix source code line after line (including macro/machine instructions).
    Something as simple as a language identifier character in column-1
    or 999? would make the development of a universal compiler less
    complicated, or we could program in the intelligence to allow the
    compiler to determine the language.  It may be required that the
    programmer keep in mind some of the intrinsic architecture of the
    languages being used (i.e. argument passing mechanisms, etc), but
    even that could be eliminated or seriously reduced.
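    A toy front end for the column-1 idea might look like this (a Python
    sketch; the tag letters and the grouping rule are invented for
    illustration, not part of any real product):

```python
# Toy sketch of the "language identifier character in column-1" idea.
# The tag letters are invented; a real universal compiler would hand
# each fragment to the matching compiler instead of just grouping lines.

TAGS = {"F": "fortran", "P": "pascal", "M": "macro"}

def split_source(lines):
    """Group consecutive lines sharing a column-1 tag into
    per-language fragments, preserving their order."""
    fragments = []
    for line in lines:
        lang = TAGS.get(line[:1], "unknown")
        body = line[1:]
        if fragments and fragments[-1][0] == lang:
            fragments[-1][1].append(body)   # same language continues
        else:
            fragments.append((lang, [body]))  # language switch
    return fragments

mixed = [
    "F      DO 10 I = 1, N",
    "F10    CONTINUE",
    "P writeln('done');",
]
for lang, body in split_source(mixed):
    print(lang, body)
```

    The hard part Tom alludes to (argument passing between fragments)
    is exactly what this sketch leaves out.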
    
    Remember gang, never say never!  We are as a race achieving
    phenomenal technological growth (if only we could grow as
    fast culturally).  I don't believe anything is impossible given
    sufficient time and resources.
    
    
    			:-) Tom
    
    
    
87.36Shoping for LanguagesCSTVAX::MCLUREDavid McLureMon Apr 14 1986 11:5331
	I'm sorry if I gave the impression of wanting "all the groovy features
    put into my favorite language".  This has apparently already been done with
    ADA.

>    The trouble is, then as now, there will be people who will say "why
>    not just put all those groovy features in MY favorite language".
    
	On the contrary, I am proposing the ability to easily cluster existing
    code written in a wide variety of languages for a particular application.
    It would be nice to offer a menu of different language modules arranged
    according to functionality (as opposed to the current practice of arranging
    modules according to language only), which a programmer could select from
    in designing a particular application.  Depending upon the functionality
    needed, the programmer could add these pre-existing modules written in
    other languages to enhance his or her "favorite language" for a specific
    application.

	I think the trend (for PC's at least) is going towards application
    specific needs and away from all-purpose application environments.  Let's
    face it, the client doesn't usually care how modular or cohesive the code
    is, as long as it can do what they wanted when they wanted it (and fast).
    Besides, how available is ADA on a Pro (for example)?

	Try this analogy on for size:  if languages were stores, ADA might
    be viewed as offering the all-purpose shopping needs of Sears (for example),
    while C might offer the local goods and services of a hardware store such
    as Sommerville Lumber.  To tie this to my cluster proposal, I would say
    that I am trying to construct a "Mall" of different languages all under
    one roof.

					-DAV0
87.37Coke is it!RACHEL::BARABASHBill BarabashMon Apr 14 1986 15:1711
RE: .31

>   Curiously enough the usage in the ARM is against
>   the advised usage of trade marks, which are basically that trade
>   marks are damaged by using them as nouns.
    
  The restriction against using trademarks as nouns does not apply to the
  use of *registered* trademarks.  (As far as I know DEC has never registered
  any of its trademarks.)

  -- Bill B.
87.38On religion and software engineeringMLOKAI::MACKIt&#039;s the real world after allSat Apr 19 1986 22:1754
    An observation on the "one or many" argument going on here.  
    
    One of the responses referred to the "which language" issue as
    "religious".  I agree with the simile, but not with the implication
    that it can be dismissed as purely a matter of personal taste.
    
    It is important to understand religions.  A religion is not a set of
    unconnected irrational convictions, but a set of well-connected
    convictions based on a set of premises, which may be true or false,
    complete or incomplete.
    
    The Ada language (like that, Bevin? :-)) is (perhaps) the first
    language ever founded on a software engineering process model.  Ada is
    primarily concerned with "what is the best way to engineer a software
    product?"  Ada will have both the strengths and weaknesses of the model
    it uses for the software engineering process.
    
    What makes Ada different is that it is the first language to take this
    approach.  The others were primarily concerned with "what is the best
    way to code a module for {formula translation, business, education,
    etc.}" 
    
    In the typical software engineering process model, "coding" refers to a
    single activity.  Many replies to this note suggest that different
    parts of systems are so different that they are best done by completely
    different tools and perhaps in different ways.  A parallel engineering
    process would crudely lump all activities in building a house under the
    label "construction". 
    
    This may be true.  If so, the next generation of software engineering
    environments may reflect the more complex model.  Perhaps job titles
    will include specialties like "human interface engineer" or "multi-
    tasking engineer", with tools tailored to those roles.  
    
    Today, if the "art" being practiced is software engineering, not pro-
    gramming or computer science, Ada represents the "conservative state of
    the art" language. Things like Owl may represent the "aggressive state
    of the art" language.  Other languages can be used very successfully
    with a software engineering environment, but aren't *tailored* to it. 

    Religions often differ most in how people define themselves.
    That last paragraph contained an explosive issue (eng. vs programming
    vs science) that may be at the root of this matter.  Since it forms a
    digression, I shall start a separate note on it.  (I couldn't spell
    "separate" before I learned Ada! :-) ) 
    
    To continue this discussion, how would you like to see the single block
    in the phase process called "Coding" divided up?  What separate
    activities does it entail?  (Note that I refer specifically to "coding"
    activities, not unit test or detail design, although a particular
    activity may engulf these as well.)  What *different* activities are
    today collected under the title "writing code"? 
    
    							Ralph
87.39. "Humorous interlude" by TILLER::SEARS Tue Apr 22 1986 11:35
    This note and the interesting replies remind me of an experience:
    
    I was at a Boston VAX DECUS Local User's Group (LUG) sitting in
    on a meeting one day.
    
    One must be aware that the Boston VAX LUG derives its membership from
    MIT, Harvard and all the local hi-tech businesses, and is probably
    the strongest in the country with some very competent people who
    have some very strong opinions. Some of their strongest opinions
    are reserved for the standard religious issues such as operating
    system choice and computer language.
    
    Well, we were in the question and answer period before the formal
    presentation when everyone pops up and seeks help or
    presents their latest VMS Vn.+1 hints. 
    
    It was at this time an (obviously neophyte) user got up and said:
    
    "We just ordered a VAX and are curious to see if anyone has an opinion
    as to if we should use VAX/VMS or UNIX for our operating system?"
                                                      
    The formal presentation got started that evening...
    
    - paul
87.40. "Analogies causing anomalies..." by CSTVAX::MCLURE (David McLure) Tue Apr 22 1986 13:32
    	Analogies can be useful tools in a discussion when they help
    establish a familiar conceptual model.  Unfortunately, I can't help
    but recall an old proverb: "Never discuss Religion or Politics..."
    I'm afraid it applies here as well.  I prefer to use something more
    non-committal such as comparing computer languages with actual human
    languages (i.e. English vs. French).
    
    	In order to address the recent questions raised by the originator
    of this topic (in .38), that being the activities collected under the
    title "writing code", I would break it down into three categories:

    	(1)	Converting design metacode to pseudocode.

    	(2)	Expansion of pseudocode to compilable modules.

    	(3)	Additions and tweaks to debugged & tested modules.


    	This was too easy, and I'm sure I've left out something
    here, but I think that actual coding is easy compared to all of the
    other elements of the design cycle (as long as those other elements
    are actually performed - it's when they aren't that coding becomes
    an impossibility).

    	Maybe the real issue is "What are those other elements, and how
    do they influence the choice of a language or how do languages help to
    integrate the other elements into the design cycle?"

    						-DAV0
87.41. "I should have been clearer..." by MLOKAI::MACK (It's the real world after all) Wed Apr 23 1986 16:31
    Clarification #1:
    
    In discussing coding, I was considering pseudo-code as a part of
    detail design (which of course points out that nobody quite agrees
    where the magic boundaries lie).
    
    Clarification #2:
    
    In asking what things usually end up lumped under coding, I wasn't
    thinking of "things in sequence" (phase steps) as much as "things in
    parallel", i.e. what kinds of coding are so different as to suggest the
    use of different sets of tools and skills, analogous to the difference
    between plumbing and carpentry?  
    
    (.40)
    > Maybe the real issue is "What are those other elements, and how
    > do they influence the choice of a language or how do languages
    > help to integrate the other elements into the design cycle?"
    
    Perhaps.  Maybe a yet more complex model (umph!) is needed where
    the whole design, coding, and test phases are divided into parallel
    branches for the different kinds of "programming" involved.  (This
    gets to look more and more like a construction project!)
    
    So, somewhat broader this time, the question:  "What kinds of software
    within a single project require such different tools and skills for
    their design, development, and/or testing, that they must realistically
    be classified as different kinds of activity?"
    
    I'll start it rolling with a few candidates:  Conventional, database, 
    traditional AI, and expert systems.  Comments?

        					Ralph
87.42. "A week later...still thinking" by CSTVAX::MCLURE (David McLure) Tue Apr 29 1986 15:13
    
>    So, somewhat broader this time, the question:  "What kinds of software
>    within a single project require such different tools and skills for
>    their design, development, and/or testing, that they must realistically
>    be classified as different kinds of activity?"
    

	As far as the different activities involved in writing code, the
    first thing to do is start with an independent unit or task.  Then, I
    usually look for tasks which seem to be repetitive and create a loop
    around them.  Once inside a loop, the various states can be further
    divided into some kind of a case statement.  Within each case choice,
    I either use an if-then-else structure (for small stuff), or a call
    to a sub-unit which breaks it down further.

	At the bottom of the tree hierarchy, you typically see fewer loops
    and more case statements (i.e. less flow and more assignment statements).
    My typical algorithms involve a main loop which has an initialization
    segment before, and an exit segment afterwards, with a series of tests
    in the middle which usually call sub-units.  At the far corners of the
    program tree, reside sets of data-pockets and specific functions which
    are accessed as needed.
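	That shape can be sketched in C++ (the event names and the arithmetic
    here are invented purely for illustration; the original note names no
    concrete program):

```cpp
#include <cassert>

// Hypothetical "states" standing in for the various cases in the note.
enum Event { ADD, DOUBLE, QUIT };

// Leaf sub-units at the far corners of the tree: little flow, mostly
// assignment, as the note observes.
int add_one(int x) { return x + 1; }
int twice(int x)   { return x * 2; }

// The overall shape: an initialization segment, a main loop around the
// repetitive work, a case statement dispatching on state, calls to
// sub-units, and an exit segment.
int run(const Event* script, int n) {
    int value = 0;                 // initialization segment
    for (int i = 0; i < n; ++i) {  // main loop
        switch (script[i]) {       // case statement over the states
        case ADD:    value = add_one(value); break;
        case DOUBLE: value = twice(value);   break;
        case QUIT:   return value;           // exit segment
        }
    }
    return value;
}
```

	A script of {ADD, ADD, DOUBLE, QUIT} walks 0 -> 1 -> 2 -> 4 and then
    takes the exit path.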

						-DAV0

87.43. "Step out of the lab for a minute." by APTECH::RSTONE Wed Apr 30 1986 12:51
I just stumbled into this notes conference and have become intrigued by this
topic.  However, I feel like an outsider eavesdropping on a discussion being
conducted by members of some elite clique.

For a quick frame of reference, I have been "programming" for approximately
28 years now, and do have a depth of experience that goes waaaay back.  But
what I perceive in this topic is analogous to one or more of the following:

   o  A group of truck drivers discussing the merits of a Peterbilt vs
      a Kenworth vs a Mack, etc.

   o  A group of sports car enthusiasts discussing the various attributes
      of their toys (speed, style, stability, etc).

   o  A group of pilots discussing ...whatever.

Stop for a minute and realize that "programming" involves getting
something (a concept) from here (the conceiver) to there (an operating 
computerized application).  The programming language is the medium by which
this is accomplished.

Now think of the process of physically transporting "things" from here to 
there....the "best way" is subject to all kinds of criteria: time, size, weight,
cost, convenience, familiarity, personal preference based on past experience
(both positive and negative), etc., etc.  The options encompass a huge
range of possibilities:  Macro transportation...ships, rail, air-freight,
trailer trucks; Intermediate transportation....ferry boats, subways, city buses,
commuter flights; Small-scale transportation....private boats, autos, light
planes....for either business or recreation.

Obviously you would not use a freight train to distribute parcels to homes in 
the suburbs, nor would you load thousands of people into taxis to get them
from city to city.

So back to the point...you choose a language which is appropriate for the job,
gets the job done in a reasonable time and at a reasonable expense.  If 
the boss says, "Take the company car," you don't have much choice.  If he 
says, "Take a vehicle from the motor pool," then you can decide whether you
would prefer a Cadillac limousine or a pickup truck or something in between. 

Debating the merits of one make of vehicle against another is (as stated
earlier) a religious approach.  "Personal" preference is just that.  The 
requirements of the task are something else, and no amount of debate will
bring any real consensus on the issue.
87.44. "yes, but..." by BACH::VANROGGEN Wed Apr 30 1986 22:05
    I think most readers (and writers) will agree with your main point; 
    but I don't think consensus was expected or even desirable.
    It's more entertaining and enlightening to disagree.
    			---Walter
87.45. "Oh, I disagree :-)" by ENGINE::BUEHLER (John Buehler, Maynard MA) Fri May 02 1986 09:57
87.46. "Let's try a new direction" by MLOKAI::MACK (It's the real world after all) Sun May 04 1986 19:41
    Re .44:  I also disagree.  Consensus is valuable.  
    
    There isn't likely to be a consensus here about what the "right"
    language is, because I suspect we have different goals in doing
    (planning, designing, coding, whatever) programs, but I am also
    persuaded that it is no matter of taste either. What I am hoping will
    form is a mapping that we can agree upon between the way a person sees
    his task (developing programs), the nature of the task itself, and the
    tool he uses for the task. 
    
    I don't think there is enough evidence in the field yet to define with
    any certainty the "right" approach to programming, but each of us has
    some approach or attitude toward programming.  Perhaps the best direc-
    tion in which to lead this discussion is to ask the participants to
    describe their own biases and identify what elements of their favorite
    language help them with that. 
    
    To lead the discussion in this direction, I will offer my own biases:
    
        I have been working now for four years in an environment that has
        had no formal approach to the development of software. Each of us
        has pretty much home-grown his own techniques.  This has worked
        well for us on individual projects, but has been a disaster for
        group projects.  We are now gearing up for using a down-scaled
        version of the familiar-to-most phase review process.
        
        In return for the advantages of having real specs and adhering to
        them, though, we take on the disadvantages of a certain amount of
        drudgery.  In the heat of development, it is easy to say, "Awww,
        the heck with it..." and go back to the meandering approach that is
        so familiar, and, in the short run, easier. 
    
        Ada helps with this because (a) it supports the software engin-
        eering process with strong data typing and the possibility of
        implementing abstract data types, and (b) it makes it fairly
        inconvenient to go back to the meandering, "maybe I'll design
        that part after I code this one" approach. 
    
        A software project in Ada using a phased approach is easier to
        "share out".  You can hand people package specifications, and they
        can first implement a stub to go behind that spec. As they imple-
        ment the body of the package, they can slowly replace the stubs
        with real code "transparently" to the other modules using that
        package.  Thus more work can go on concurrently, with less
        communication needed between the people involved. 
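        The spec-then-stub-then-body progression can be sketched outside Ada
        too.  A rough C++ analogue (the lookup_price name and its pricing rule
        are invented for illustration) uses a declared interface that callers
        compile against, plus a replaceable implementation behind it:

```cpp
#include <cassert>

// The "package specification": other modules see only this declaration
// and can be written -- and compiled -- against it immediately.
int lookup_price(int part_number);

// First deliverable: a stub behind the spec.  It lets the other modules
// link and run while the real body is still being written.
int lookup_price_stub(int /*part_number*/) {
    return 0;  // placeholder result, good enough to exercise callers
}

// Later deliverable: the real body replaces the stub "transparently".
// Callers never change, because they only ever saw the specification.
int lookup_price(int part_number) {
    return part_number * 10;  // invented pricing rule, for illustration
}
```

        The point is the division of labor: once the specification is handed
        out, work on either side of it can proceed concurrently.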
    
        Ada's bigness doesn't particularly bother me.  CPU time is cheap
        and getting cheaper, and it won't be long before software devel-
        opers have a VAX under their desk with a reasonably spacious hard
        disk and a NI-Cluster connection as a matter of course anyway. 
        
        Is Ada the "best" language?  Who knows?  It seems to offer certain
        advantages for the phased development process.  Tasking as an
        inherent language feature may become a benefit if symmetrical
        multi-tasking becomes available.
        
    Whose turn is it next?
        
    						Ralph
87.47. "You can't teach new dogs old tricks" by CSTVAX::MCLURE (David McLure) Mon May 05 1986 03:45
    	I can relate to the meandering approach (re. -1), because for the
    last two years, I have been the only programmer/analyst - software engineer
    - software specialist - kitchen sink installer - etc. in my whole group.
    Any sort of structure I came up with was basically done for myself and the
    great computer god(s).

    	As it turned out, I'm glad to have been able to have used a structured
    language (Digital's own DESIGN language which is part of the VAX/PRODUCER
    package used in CBI and IVIS programs).  Similar to PASCAL and/or ADA,
    it allows a very modular structuring which would (conceivably) lend itself
    well to parallel work in progress (re. -1).

    	Unfortunately, those were the days.  Since then, IVIS has had to
    return to the future (from whence it came), and I have been traded to a
    manager who requires that everyone (all two of us) use VAX BASIC for
    everything (and I mean everything) because it's his "favorite language".

    	My main reason for participating in this particular notesfile so
    heavily is two-fold:

    	(1)	I'm trying to talk myself into using VAX BASIC and not having
    		to find a new job which doesn't require coding in it.

    	(2)	I'm looking for ammo to use to convince my manager that 
    		maybe VAX BASIC isn't the best language to use to standardize
    		all coding in for the group.

    	During the past few weeks of soul-searching, I have discovered that:

    	(1)	VAX BASIC is more structured than most BASIC implementations.

    	(2)	My manager isn't budging an inch from his VAX BASIC requirement.

    	(3)	VAX BASIC is used (to my surprise) very widely within DEC and
    		I'm having trouble finding job openings which use anything
    		other than VAX BASIC and/or COBOL.

    	(4)	I would still much rather use ADA, DESIGN, PASCAL, C, LISP,
    		OPS-5, DATATRIEVE, BLISS, FORTRAN-77, or PL-1 over any version
		of BASIC.  I obviously haven't succeeded in talking myself
		into using VAX BASIC yet - other than to keep my manager happy.


	The one good thing I've been able to say about VAX BASIC is: "as long
    as BASIC is around, no-one will have to worry about programming themselves
    out of a job".  However, I sometimes wonder about the future of a company
    which seems slow to allow software development to evolve into using more
    structured languages.

    						-DAV0
87.48. "Some ideas for you" by LATOUR::AMARTIN (Alan H. Martin) Mon May 05 1986 12:47
Re .47:

Whether VAX BASIC should be in the running depends on what your group
(or just you) are supposed to be doing.  What kinds of problems are your
programs trying to solve?

An angle you might consider is to learn as much as you can about VAX
BASIC, and to use it in the best way you can devise.  This means giving
your job your best shot.  It should hopefully yield some positive
results.  If your code is noticeably more readable, maintainable and
better structured, it will hopefully become more reliable than the group's
norm.  That shouldn't hurt your reputation any.  And if there are
shortcomings to doing quality work in VAX BASIC (I wouldn't know, I
haven't used Basic seriously since high school, and have never learned
about this product's features), then really striving for perfection
ought to make it obvious what they are.  Then you will be able to honestly
point them out and say "If we were using Language X, this bug would
have been caught by strong typing", or "I wouldn't have to code around
the lack of support for feature Y if we were using Language X".  Having
solid facts to back up your opinion doesn't hurt either.

Also, you ought to be entitled to take internal programming courses,
even those that prepare you for your next job, rather than your current
one.  If you take a course that teaches a better direction than your
group is taking, you can't be blamed for sharing what you have learned.
(And you might make a contact with someone that has an open req that
is a better fit than your current assignment).

I hope the above is more constructive than mere advice on how to be
a "rabble-rouser".  I could certainly advise you on how to get yourself
into trouble, I hope I have not merely done that.
				/AHM
87.49. "Language training for Sales Training?" by CSTVAX::MCLURE (David McLure) Mon May 05 1986 15:14
	Thanks for the advice (re. -1).  I will give VAX BASIC my best shot-
    as always.  As far as "rabble-rousing" goes, if you'll notice the time
    stamp on note 87.47, I was up pretty late last night banging my head
    against the proverbial wall and this was the result.

	As to training, this is an interesting subject since (when it comes
    to learning new languages) the only training I have been able to get approved
    since I have been at Digital (over 2 years now), is a CBI course account
    for the VAX/PRODUCER (DESIGN) language.  This has had some interesting
    "sink or swim" consequences as I have since mastered DESIGN, and gone
    on to investigate many other languages on my own as well.

	First I got ahold of the MACRO-11 student guide last summer, last fall
    it was LISP, this winter I decided to dust off my PASCAL books, and now
    I just purchased the official government version of the ADA reference
    manual (for a surprisingly low $15.00 at the M.I.T. Tech-Coop) as well as
    some programming reference cards for the C language; all of which was
    largely inspired by this notesfile.  I'm also trying to coordinate this
    with an associated graphics language interface (I consider computer
    graphics protocols to be languages as well) and also recently purchased
    the "Introduction to the Graphical Kernel System (GKS)" (see the SARAH::
    GRAPHICS notesfile for details under "Graphics Programming For MIS").

	These activities have been limited to off-work hours so far; at work I
    have been reading up on VAX BASIC and DATATRIEVE for my current job as Programmer/
    Analyst for Sales Training MIS.  With such a wide variety of languages
    available and all the resulting diversions of developed software which
    currently exist, it dawned on me several notes back (see "Tower of Babble"
    in .22) to develop a method of clustering languages into functional groups
    in order to assemble an application largely from existing code requiring
    minimal modifications.  This all started from an idea Tom Repstad (.19) had
    of developing a Universal compiler.  I'm not trying to rekindle this dis-
    cussion necessarily, I'm just trying to provide a framework for all this
    "babbling".


						-DAV0
87.50. "It pays to know your *real* goals" by MLOKAI::MACK (It's the real world after all) Tue May 06 1986 16:00
    Re .49:  Gee, I got my Ada manual "free" (finally got some of my
    	     tax money back) from the federal government.
    
    Re. Basic:  
    
    Actually, VAX BASIC is reasonably good on program structure as long as
    you program that way.  It is reasonably good on data structure as long
    as you insist on OPTION IMPLICIT=NONE (syntax?), forcing you to declare
    everything, and capturing anything you don't.  There are a couple of
    funnies with dynamic strings, but VAX BASIC programmers already know
    about those. 
    
    My only gripe about VAX BASIC is that it fails in one of BASIC's
    greatest historical strengths, as an interactive prototyping language.
    It burns me that I can't say X = 1 on one line and PRINT X on the next
    and get it to print 1.  I guess this is because VAX BASIC is in the
    "Commercial BASIC" class. 
    
    Most all languages can be used "in accordance with sound engineering
    methods", but it is easier to abuse those methods in some languages
    than in others, and in the heat of development, it is nice to have a
    little extra language support for "doing it right".  However, lacking
    that, management support for "doing it right", with the proper
    controls, adequate education, and "buy-in" by the group, can alone be
    effective, and these will be necessary even with a language change. 
    
    Your best bet is probably not to abandon BASIC, but to work on these
    angles.  They will accomplish what you are aiming toward more than a
    change of language will.  In that approach, you might be surprised in
    finding yourself in violent agreement with your management. 

    				Going through a similar process,
    
    					Ralph
87.51. "Meanwhile, Japan is putting 5th generation languages on chips." by CSTVAX::MCLURE (David McLure) Thu May 08 1986 03:12
	I can handle using VAX BASIC for the time being.  There are plenty
    of issues I would like to discuss concerning BASIC (such as the use of
    line number labels versus text labels, implicit variable usage, etc.),
    but I think this discussion would be better served by sticking to the
    global issues concerning programming languages, and leave the nitty-
    gritty details for other more specific notefiles.

	Follow me for a second for a brief climb back up the stairs of the
    programming language "ivory tower"...what if I were to rephrase the
    question posed by this note to read: "Why don't we use the same language?".
    The U.S. military asked itself this question, and the result was ADA.

	The fact is that the future of Digital will depend more and more upon
    its software than its hardware (which has traditionally been the major
    money-making product in this industry).  This message was apparent to me
    from the keynote addresses of both Ken Olsen and Bell Cross at the recent
    IDECUS symposium.  Imagine the integration which would be possible if
    everyone could agree to use the same programming language.

	Obviously, we still need to support all the languages we do for the sake
    of our customers, but how about internal software applications?  What sorts
    of discussions would take place if one morning we arrived at work to find
    out that upper level management had decided to migrate all internal soft-
    ware applications development to one particular language, and to focus
    a great deal of effort on supporting as well as engineering new features
    for this language?

	Actually, there are efforts currently underway to standardize all
    future Digital applications architecture, but my question focuses upon the
    choice of a standard language which would allow Digital to progress into
    the future.  Could it be an existing language?  Or, would it be better
    to try to implement some sort of VAX-specific 4GL (or even 5GL) language?

						-DAV0
87.52. "Re-inventing the wheel but square this time" by ROYCE::DAVIES (Stephen M Davies <nulli secundis>) Thu May 08 1986 04:21
A few words from over the Pond...

BASIC, COBOL, FORTRAN, etc. are OK, but have their limitations in the modern 
world.
C is, as mentioned elsewhere, really a hacker's script language, like UNIX itself
(if you don't agree re UNIX, don't reply here but in the UNIX/ULTRIX file).

CORAL-66, PASCAL, ADA, and MODULA are gaining popularity because they can (when 
properly implemented) be made to do almost anything you want, but you have
to declare it first, thus taking out all the hidden/implicit stuff in that
implementation. Also D.O.D. (U.S.) & N.A.T.O. are more likely to sign on
the dotted line if you use one of the above.

How about some comments on the following scenario.

You have just implemented a bit of H/W using a 68000 cpu and Software written
in C, using a third-party compiler. Your next job is to build a new interface
to a new Bus. Most ( if not all ) of the existing interfaces use an in house
on board CPU, and in house developed tools, using a pascal like language for
the on-board S/W.  Do you utilize your team's experience in C, but face all 
the H/W trickies in using the 68000 instead of the in-house CPU?  Or do you
avoid re-inventing the wheel: use as much of the PCB layout of existing boards and
as much of the code from other boards as possible, but get your team 
trained up on Pascal?
Any comments on this theoretical situation would be most welcome.

/Stephen

87.53. "A little more rounding on the wheel" by ENGINE::BUEHLER (John Buehler, Maynard MA) Thu May 08 1986 10:16
  As one person put it "All the semi-colon languages are the same".  Why
do we keep coming up with languages that are just a little bit more this
way or a little bit more that?  Is the effort put into developing an entirely
new language justified by what the new languages give us?  Why do we recreate
languages instead of enhancing them?
87.54. "All semi-colon languages not the same" by MLOKAI::MACK (It's the real world after all) Thu May 08 1986 10:50
    Actually, all the "semi-colon" languages aren't the same.  
    
    C is weakly typed.  Weakly-typed languages generally allow data with the
    same machine representation to be interconverted, usually without
    explicit casting (type conversion). 
    
    PL/1 and PASCAL are both moderately-strongly typed.  Moderately-strongly-
    typed languages allow data using the same root language constructs (all
    integers, all strings, etc.) to be interconverted, usually without
    explicit conversion.
    
    ADA, MODULA, and C++ are very-strongly typed.  Very-strongly typed
    languages provide a mechanism for *users* to specify categories of
    data that are interconvertible without explicit casting.  
    
    Thus in a strongly-typed language, you can have two different float
    types derived from the same language data type (FLOAT) with the same
    machine representation (F_float, stored in a longword) that will not be
    interconvertible without explicit (casting, conversion, pick your
    favorite buzzword). 
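    The two-float-types example can be sketched in C++ itself, which .54
    counts among the very-strongly typed languages (the Meters/Seconds names
    are invented stand-ins; both wrap the same machine representation):

```cpp
#include <cassert>

// Two user-defined types with the *same* machine representation (a float).
// Because each is a distinct type, the compiler rejects mixing them
// without an explicit conversion -- the behavior described above.
struct Meters  { float v; };
struct Seconds { float v; };

Meters add(Meters a, Meters b) { return Meters{a.v + b.v}; }

// add(Meters{1.0f}, Seconds{2.0f});  // would be a *compile-time* error

// Interconversion must be spelled out, so the programmer's intent --
// "yes, I really mean to treat this time as a distance" -- is visible:
Meters seconds_as_meters(Seconds s) { return Meters{s.v}; }
```

    The compile-time rejection is exactly the "pain" the next paragraph
    describes: errors surface before the program ever runs.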
    
    "But wait a minute!  That means if you aren't painstakingly careful
    about what you do with your data, you will get all kinds of compiler
    errors.  What a pain!"  That's right.  Languages like Ada and Modula
    aren't very good for prototyping or iterative development. For that,
    I'd pick something like C (for speed), or LISP or a good interactive
    BASIC (for the environment). 
    
    But for phased product development, where you are (at least theor-
    etically) designing before coding, and intend to produce a meticulously
    clean V1.0 product, compile-time errors are a lot easier (and cheaper)
    to trace and fix than run-time errors, so taking the time to define
    what data is interconvertible using a strongly-typed language is a
    win.  
    
    Obviously, of course, by not providing your own derived types and
    using only the system-provided types, you can use Ada or Modula
    as a moderately-strongly-typed language, but then you aren't using
    the language the way it was intended to be used.

    						Ralph
87.55. "More fat for the fire..." by AIAIO::REPSTAD (Tom (Popeye was a Coastie) Repstad) Thu May 08 1986 13:07
re: the differences in languages...

    So far the only differences in languages that have been discussed
relate primarily to the syntax of the language, and the kinds of 
data structures/type checking that is practiced. Let's bring into this
discussion some of the "hidden" attributes of languages. Like...

	Parameter passing mechanisms....

        Binding Times....

   	Scope...


     The above features are all critical when designing a language, and
are usually defined in the language design phase. How do the various
implementations of the above concepts affect the way we program, and
the choice of which language to use?  One of the things that frustrates
me about Pascal is the way variables are scoped...There are times
when I like being able to use a record defined in the main program
in a subroutine without having to re-declare it, but there are other
times when you can run into trouble. Fortran by contrast provides
for global scoping only through the use of a common statement which
must be explicitly added to each subroutine. What are the pro's and 
con's here?
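     The trade-off can be crudely imitated in C++ (names invented; a
global plays the role of a Pascal outer-level variable, and an explicit
parameter stands in for FORTRAN's declare-what-you-share COMMON discipline):

```cpp
#include <cassert>

// "total" acts like a Pascal outer-level variable: any routine below can
// use it without re-declaring it -- convenient, but changeable from
// anywhere, which is exactly where the trouble comes in.
int total = 0;

void add_convenient(int n) { total += n; }  // implicit outer-scope access

// The explicit alternative: every routine that touches shared state must
// say so in its own interface, here by taking it as a parameter.
int add_explicit(int t, int n) { return t + n; }
```

     The convenient form is shorter; the explicit form makes every
dependency visible at the call site.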

     Parameter passing between languages can be a real pain in the a**...
if you don't match things up, you get screwed up...I would like to 
see a modification to all languages by ANSI that would allow for the 
use of the VAX implementations of %DESCR and %VAL (not all compilers
and/or O/S's have this nifty feature!).
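    The two most basic mechanisms can be sketched in C++ (function names
invented; the by-value case is roughly what %VAL-style passing gives you,
and the by-reference case is the classic FORTRAN-style mechanism):

```cpp
#include <cassert>

// Pass by value: the callee works on a copy, so the caller's variable
// cannot be disturbed by the call.
void bump_copy(int x) { x += 1; }       // changes only the local copy

// Pass by reference: the callee sees the caller's own storage, so the
// change survives the call.  Mismatch the two across a language boundary
// and "you get screwed up", as noted above.
void bump_in_place(int& x) { x += 1; }
```

    The danger across languages is precisely that both calls look alike
at the call site while meaning very different things.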

    Binding is another issue: when should a language bind a variable
to its type? If it's bound at compile time, you can catch type
mismatches between modules if the linker is smart enough. Some languages
(SNOBOL?) allow you to change the binding/type of a variable during runtime!
Is this a good/desirable feature? 
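    C++ binds types at compile time, but std::any (a much later C++17
feature, shown only as an approximation of the SNOBOL-style behavior
described) lets one variable hold different types at different moments,
with the type checked only when the value is extracted:

```cpp
#include <any>
#include <cassert>
#include <string>

// One variable, rebound from int to string at run time.  The type check
// moves from compile time to the any_cast at the point of extraction.
std::any rebind_demo() {
    std::any v = 42;                  // bound to int for the moment...
    v = std::string("now a string");  // ...rebound to a string at run time
    return v;
}
```

    An any_cast to the wrong type fails at run time rather than compile
time -- which is exactly the trade-off being asked about.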

    I still support the concept of a universal compiler and/or a universal
language that incorporates ALL the features of all languages (what a project
that would be to work on...). Then instead of inventing a new language,
all you would have to do is create a new extension to the Universal
language. And I don't care what anybody says...I think it can and eventually
will be done!

    Well, that's enough fat for the fire for one day............



				--Tom


87.56. "Universal Language" by CSTVAX::MCLURE (David McLure) Thu May 08 1986 15:55
    	If a project did in fact exist to create a Universal Language
    (call it Unilang or something for short), then it would seem to me that
    it would have to be made up of various layers of access for the various
    types of users programming in it.  Just as the ISO communications
    model is made up of seven layers, so would Unilang be.

    	This would allow different standards to be set up for each level
    of implementation, as well as provide for future generations (i.e. 4th,
    and 5th generation language enhancements).  Each level would provide
    a specific grouping of hardware-dependent access, along with accompanying
    performance requirements, as well as providing artificial intelligence,
    prototyping capability, and user friendliness as attractive features
    of the higher levels.

    	Other language modules could be applied to the desired level of
    Unilang as specified in the ISO-like model, as well as other hardware
    configurations.  Chips could eventually be implemented in some of the
    agreed upon layers as they become standardized.  Yes, it would be quite
    the project!


    						-DAV0
87.57. "OK, who'll implement VAXUL?" by ENGINE::BUEHLER (John Buehler, Maynard MA) Thu May 08 1986 18:38
RE: .54

  Please don't be offended, but it sounds like you're too close to your
programming to really see that the languages you mentioned all fall into the
same "semi-colon" category. I'm completely aware of all the points you made,
and they just don't separate the languages *that much*.  They're primarily the
same stuff rehashed. Call them block-structured languages, if you like. 

RE: .56

  I really like the idea of a layered language, assuming that I can get
to any given layer on a procedure-by-procedure basis.  I don't think I'd
want a line-by-line implementation, though (I don't think anyone would want
to implement it that way, either).

  What you say about converting layers to hardware is a definite possibility.
Just use graphics as a model.  As the lower level algorithms become more
and more stable, they get implemented in hardware.  I'd love to work on
a machine that supports 50% of my language operations directly in hardware.
The VAX is pretty good with high-level operations already, though.  Yeah!
The more I think about it, the more I like it.  Now somebody is going to
reply with a fire hose approach and put out the fire.

P.S.  I really like that idea, David (did you think it up?)
87.58sounds familiar...BACH::VANROGGENThu May 08 1986 19:1528
    Of course, after there have been hardware implementations of some
    of these lower layers, someone will come along and say that a
    simpler language, being easier to implement, can run faster,
    since most of the time is spent doing simpler operations, and
    that the effort saved can go into optimizing other language
    features.  Sound familiar?
    
    There already are a number of languages which "integrate several
    programming paradigms", primarily AI-related concepts. These have
    generally been built on top of Lisp, and allow use of:
      traditional procedures and data structures/types;
      functional programming;
      symbolic representations;
      object-oriented programming (inheritance);
      data-base storage;
      forward and backward inferencing;
      meta knowledge;
      demons;
      constraints;
      non-monotonic logic.
    Hmmm, I've probably missed some other features (any other buzzwords
    I've forgotten? :-).
    
    And those are just the "language" features; they usually have all
    kinds of interface tools and "workbenches".
    
    			---Walter
    
87.59Instructions, ROM, or RISC?ROYCE::DAVIESStephen M Davies <nulli secundis> Fri May 09 1986 12:459
Re the last part of .57,

The next question from this bit is: do we add all these good primitives as
instructions, or do we follow the RISC approach and execute them like
a bit of ROM?

/Stephen
    

87.60A picture is worth 1000 wordsCSTVAX::MCLUREDavid McLureSat May 10 1986 05:1020

                         VAX UNIVERSAL LANGUAGE MODEL

               +------------------+         +------------------+
               |  INPUT   OUTPUT  |         |    PROCESSING    |
               +----v<<<<<<<^-----+         +----v<<<<<<<^-----+
--------------------v       ^--------------------v       ^--------------------
                    v       ^   5th Generation   v       ^
--------------------v       ^--------------------v       ^--------------------
                    v       ^   4th Generation   v       ^
--------------------v       ^--------------------v       ^--------------------
                    v       ^   3rd Generation   v       ^
--------------------v       ^--------------------v       ^--------------------
                    v       ^   2nd Generation   v       ^
--------------------v       ^--------------------v       ^--------------------
                    v       ^   1st Generation   v       ^
--------------------v       ^--------------------v       ^--------------------
	            v	    ^<<<<<<<<<<<<<<<<<<<<v       ^
	            v>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>^
87.61Sallying FORTH!COGITO::STODDARDPete Stoddard, the METALS toolsmithTue May 13 1986 17:1915
    Continuing on the subject of layering languages, (and to mention
    my at-home hacking favorite,) FORTH has been described as "a
    terrible applications language, but a great language for implementing
    applications languages".  Also, FORTH has already been implemented
    in silicon on a single chip computer.  It can be both an interpreter
    and a compiler, generates very compact, very fast code, and is
    infinitely extensible.  As far as using it to implement languages
    goes, there has been at least one LISP written in FORTH.  It does
    have the property, however, that you either love it or you hate
    it.  Religion strikes again.
    
    Any FORTH lovers (or haters) care to comment?
    
    					Have a GREAT day!
    						Pete
87.62In search of a base layer...CSTVAX::MCLUREDavid McLureWed May 14 1986 02:4013
    	I used to play around with FORTH on my brother's Commodore 64.
    I can see why LISP is done in FORTH; they are very similar.  I can
    say that I am "in-like" with both languages and think they would work
    well as models for some of the lower layers of VAXUL (thanks for coining
    the name John - in 87.57).

    	Whether either of these languages would negate the need for a
    layered, ISO-standardized language providing all of the features
    for all of the different types of programming needs that will exist
    in the present and future remains to be seen.
    
    							-DAV0
87.63DREGS::BLICKSTEINDaveThu May 15 1986 13:4419
    re: .57 "John Buehler"
    
    I call them "FORGOL".  They all appear to me to be variations and
    one-pluses of FORTRAN and ALGOL.  Some people see things like typing,
    scoping, etc. (all characteristics of FORGOL-family languages) as
    big differences, but to me all those languages force you to model
    the problem in the same way, and how the problem and the solution
    are modelled is what is most interesting to me.
    
    Examples of languages that require different modelling are SETL,
    OPS5, and to somewhat lesser extents, APL (especially with nested
    arrays, generalized operators and user-defined operators)
    and LISP.

    We need to figure out ways to directly specify what the answer is,
    and not just diddle and daddle with ways that specify how the answer
    is computed.
    
    	db
87.64You can write Fortran programs in (almost) any languageLATOUR::AMARTINAlan H. MartinFri May 16 1986 00:1011
Re .63:

It is possible to write Fortran programs in SETL, APL and LISP.  (I don't
know about OPS5.)  Those languages don't demand a different approach,
they merely permit it.  It's probably harder to write Fortran programs
in Prolog or FP, but it still might not be impossible.

Do you have anything to share about the great variation between programs
written by different individuals, in the same language, which attempt
to solve the same problem?
				/AHM
87.65Sprechen Sie FORGOL?ENGINE::BUEHLERJohn Buehler, Maynard MAFri May 16 1986 09:4938
RE: .64

  I am amazed (and, at times, appalled) by the variations in how people
approach the solving of a design problem.  The variations are from
near-brilliant to ridiculously bad (opinions, obviously).  I know that when I'm
designing, I'm forming a mental image of the data structures, hardware,
interfaces, algorithms, etc, that I'm employing.  This allows me to juggle them
'physically', which makes things very easy for me.  I'm a heavy PASCAL
programmer, and have finally gotten to the point where I can think in PASCAL
(for better or for worse), which allows me to 'ignore' the language, and just
do the coding of the design on the fly (no stopping to find out if the
language supports such-and-such a feature, etc).  I'm sure that some of my
designs are less than they could be because I'm so bound to PASCAL.  'Thinking
in one language' can be bad at times.  This probably applies to real languages
as well (e.g. French, etc).

  Now I'm bound to the language, and that's one reason I'll support PASCAL
as the 'FORGOL' language of choice (getting back to the original topic).
I can *think* in the language.  I have all the capabilities of PASCAL in
my little head, and don't have to think about what the language can do for
me, I just design within the known limitations.  But that means that after
I've designed a couple of hundred programs in PASCAL, I could probably write
a program to do my thinking for me because my designs will have gotten so stale
that I wouldn't want to think that way anymore.

  Then come other languages, which allow you to think in a different language.
Well, first you fight with it, *then* you think in it.  But there, again,
you're back to thinking in one language.  Eventually, you'll get stale again,
unless you do a FORGOL quickie once in a while.  This is the reason I like
VAXUL - the universal, layered language mentioned several or more replies
ago.  You're not bound to a small subset of functions.  The number of layers
would never be limited.  It's not that hard to conceptualize LOPS5 (to coin
another term - this one for AI languages) on top of FORGOL on top of MACRO.
If I can still go after all layers, imagine the possibilities.  The number
and type of designs will increase by many times.  Learning the one language
would be kinda rough, but once you've got it, all you have to do is pick
up the new layers as they come along (and hope nobody completely replaces
VAXUL with the next invention down the road).
87.66DREGS::BLICKSTEINDaveFri May 16 1986 18:1438
    I would say that to programmers who understand those languages,
    they "suggest" a different approach.
    
    I understand perfectly what you are saying though.  I once had to
    grade student programs for an assignment to write a program to compute
    averages in FORTRAN, LISP and APL.   Needless to say, most of the
    APL programs did this by a loop, and most of the LISP programs used
    a PROG, which (as was intended) indicated that those students hadn't
    grasped the fundamental differences of those three languages.
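    That idiom gap can be made concrete.  Here is a sketch in present-day
    Python (used purely as a neutral stand-in for all three languages; the
    function names are invented for the example) of the loop-and-accumulator
    average the students kept writing, next to the whole-collection
    reduction an APL or LISP hand would reach for:

```python
from functools import reduce

def average_fortran_style(xs):
    # Explicit loop and accumulator: the habit the students carried over.
    total = 0.0
    count = 0
    for x in xs:
        total += x
        count += 1
    return total / count

def average_apl_style(xs):
    # Whole-collection reduction: the shape of APL's sum-reduce (+/X)
    # divided by the length of X.
    return reduce(lambda a, b: a + b, xs) / len(xs)
```

    Both compute the same number; writing the explicit loop where the
    language hands you a reduction is the tell that the student was still
    thinking in FORTRAN.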
    
    Regarding your second question, I tend to think that substantial
    variations within one language (not things like different variable
    names, etc.) tend to indicate that the language is either inefficient
    for representing the problem, or that the language has duplicated
    features.
    
    I'd like to think that in the ideal language there is a clear and
    obvious 'right' way to specify a particular thing.   That is to
    say that the nature of the language suggests doing things a particular
    way.
    
    Unfortunately, APL is poor in this respect (although by comparison,
    some of the kitchen sink languages like PL/I and Ada would then
    be "notorious").   One mark against APL in the classic (and silly)
    "readability question" is that there are dozens of "right" ways
    to do many common things.  I believe that in most of those cases,
    it results from mapping a strict matrix model onto something that
    is not intrinsically a matrix.  This problem is improved somewhat
    by the nested array extension and the increased flexibility of
    representation nested arrays allow, but it will always be a sore
    point as long as APL functions operate principally on arrays.
    
    	db
    
    P.S. Some misguided person once made the mistake of giving APL
         FORGOL like features (control structures, declarations, etc.)
         It generally went by the (appropriately enough) name "APLGOL",
         although it never took off.
87.67I wrote .64 while in an algorithmic moodLATOUR::AMARTINAlan H. MartinFri May 16 1986 19:0622
Re .66:

Actually, I was thinking about the range of entirely different algorithms
that people use to solve the same problem, not identical algorithms
expressed differently because of overlapping language features.  Because
I had just come out of a lecture on "The Science of Programming" by
David Gries, my mind was still on some of the more beautiful solutions
he showed to some nifty little problems he posed.  I was so blown away
by the simplicity of them (even when I devised something with the same
time complexity, it was still comparatively ungainly) that coming back
to a discussion of different languages seemed of minor impact.

While I'm sure that language choice can make or break a project, I was
skeptical that an APL or Lisp programmer would have been more likely
to come up with some of those solutions merely because their brain was
wired for applicative or functional style.  24 hours later, I'm still
recovering.

BTW, maybe my eyes glazed over, but I thought I saw "my control structures
for APL" articles in SIGPLAN Notices for several issues in a row during
the past few years.
				/AHM/THX
87.68Simplicity and elegance via abstractionCSTVAX::MCLUREDavid McLureSat May 17 1986 17:3226
re. 67		I'm not surprised that simplicity "blew you away" compared
    	to this discussion.  I think it is crucial that we establish a common
    	level of abstraction from which to build our theoretical model.

    	Now, here is a Catch-22 abstraction to define a universal language:
    "A universal programming language is one in which any computable function
    can be expressed as a program.  A function is computable if it can be
    expressed as a program in some programming language." (taken from "Program-
    ming Languages - Design and Implementation - 2nd Edition" by Terrence W.
    Pratt, page 350).

    	A quick history of simple abstract machines (or automata) leads back
    to that of the mathematician Alan Turing [1936].  The Turing Machine has a
    single data structure: a variable-length linear array (the tape) holding
    an unbounded sequence of single read-write characters, pointed to by a
    single pointer variable (the read head).  Branching is achieved by
    constructing a new address on the tape based on the read head (which can
    be directed to the right or left of the current position) and the value read.
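    For concreteness, the tape-and-head machine just described can be
    simulated in a few lines.  This is only an illustrative sketch in
    present-day Python; the rule-table format and all names here are
    invented for the example, not taken from Pratt's text:

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    """Simulate the single-tape machine described above.

    rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (step the head left) or +1 (step it right).
    """
    cells = dict(enumerate(tape))   # a sparse dict stands in for the unbounded tape
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol    # write, then move the read head
        head += move
    return "".join(cells[i] for i in sorted(cells)), state

# Example rule table: flip every bit on the tape, halting at the first blank.
FLIP = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", +1, "halt"),
}
```

    Church's thesis then amounts to the claim that any computable function
    can be phrased as such a rule table, which is what makes this machine
    a plausible bottom layer.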

    	A language which programs the Turing Machine is defined as universal
    by the mathematician A. Church in his thesis which claims that "any comput-
    able function can be computed by a Turing Machine".  How about using this
    as a theoretical model for subsequent VAXUL layer definitions?

    						-DAV0
87.69DREGS::BLICKSTEINDaveSat May 17 1986 20:3928
>BTW, maybe my eyes glazed over, but I thought I saw "my control structures
>for APL" articles in SIGPLAN Notices for several issues in a row during
>the past few years.

    Yes I've seen those.  But try and find such proposal in any recent
    issue of "Quote Quad", the SIGAPL periodical.
    
    You have to remember that the SIGPLAN notices are an unrefereed
    publication.  I don't know if my reaction to those articles typifies
    those of the knowledgeable Quote Quad reader, but I consider proposals
    like that to be written by people who are APL "lightweights" with
    limited understanding of the language, who probably think that the
    way to "fix" APL is to make it look like FORGOL.
    
    I would NOT say the inability to do certain things because of the
    absence of structures ISN'T a weakness in APL.  I would rather say
    that if it can't be efficiently represented in vectors (or nested
    arrays), then chances are that APL is not the appropriate language
    for the task.   That is, I would rather see APL maintain its high
    level of conceptual integrity and remain a language suited for many
    tasks, than to see it lose its consistency by going the "kitchen
    sink" route and try to be suited for ALL tasks.

    Bottom line is that record structures and control structures are
    inappropriate (and in most valid APL applications unnecessary) for
    APL.  Nested arrays are a much better fit.
    
    	db
87.70A goal-oriented approachMLOKAI::MACKIt's the real world after allMon May 19 1986 17:4641
    OK.  There are a number of conflicting goals described here that we can
    seek to simultaneously maximize in a "perfect" language. Perhaps a lot
    of our personal favorite languages are based on our judgements of which
    desirable attributes we can afford to give up first. 
    
    Four that have come up are:
    
    1.  Conceptual integrity -- a state achieved when there is precisely
    	one correct way to do anything that you might consider doing.
    2.  Elegance -- when a language provides a small, distinct, and
    	non-overlapping set of primitives, selected to be sufficient
    	for any task when used in combination.
    3.  Generality -- it should be usable to solve any problem that
    	can be solved in machine language on any known ("real") machine.
    4.  User-extensibility -- the user should be able to extend the
    	language.  His extension should immediately inherit the full
    	support available to any comparable intrinsic language feature.
    
    For software engineering, there is another:
    
    5.  Software Engineering Decision Support -- a facility built into
    	the fabric of a language that allows auto-tools to make high-level
    	decisions and observations about the code, its underlying design,
    	and the problem it was supposed to solve.  

    One thing which is a baby-step in this direction is strong data
    typing.  It allows you to specify categories (types, classes) of
    data based on information in the design rather than just in the
    virtual machine implementation of the design.  
    
    What is really needed is sufficient information in the implementation
    that we can generate the design that it *really* implements and then
    compare that with the original design for discrepancies.  That kind of
    power is a long way off. 

    Can anybody think of any other attributes of the perfect language?
    
    (Note that I am not looking for features here, like strong data typing,
    but attributes that such features might seek to promote.) 
    
    						Ralph
87.71The perfect language would need a personal compilerCSTVAX::MCLUREDavid McLureWed May 21 1986 03:4341

re .70
>    Can anybody think of any other attributes of the perfect language?

	The perfect language would need a perfect language compiler/interpreter
    which would go one step further than even the most interactive compilers
    by including features similar to the Datatrieve "Guide-Mode" but even more
    intelligent.  By conversing with the compiler for a while, it would get
    to know you on a more personal level (i.e. you create your own language),
    and then it remembers your UIC so that next time it can talk to you on
    the same level of understanding without having to reset its expectations
    (similar to maintaining one's own personal data dictionary).

	The first conversation (session) with the compiler, whether interactive 
    or via command files, would form the compiler's first impression of the
    particular user (or device).  Subsequent conversations might redefine
    certain aspects of the first impression, but would most likely elaborate
    upon existing topics and dig deeper for details (forcing a top-down
    approach to programming).

	Instead of data-types, the compiler could base its impressions on
    stereotypes (based on peer group) which would remain in effect until
    redefined as a result of future conversations.  Certain common traits
    of both users and devices could be initially stereotyped by the compiler
    as defaults, with adaptive revisions occurring as needed.

	This conversation-mode would be most apparent among the higher layers
    of the language, (ready for an eventual DECTALK/Natural Language Inter-
    preter extension), but the lower layers could also relate to their peers
    in the same way.  This would allow multiple-layer access to the compiler
    based upon the individual relationship developed with the compiler.

	To avoid embarrassing situations (i.e. syntactic or run-time errors),
    the compiler would politely ask the user just exactly what in the Sam H.
    they meant and walk them through various levels of debugging.  Eventually,
    the compiler might indicate that it had grasped the concept of a particular
    statement or program.  This would mean that there would be no such thing
    as a programming error, only issues requiring further discussion.

						-DAV0
87.72some compilers are customizableBACH::VANROGGENWed May 21 1986 09:3812
    Re: .71
    
    Sounds interesting. There have been some steps taken in this direction
    by various groups, but I'm not aware of any project to do everything
    you suggest.
    
    Most lisp compilers consist of a large body of knowledge which is
    easily user-extensible. Since lisp is interactive, it's easy for
    someone to add specific compilation knowledge about his favorite
    functions as he's developing.
    
    			---Walter
87.73What is a 'good' language?VOGON::HAXBYJohn Haxby, IPG/Ultrix, ReadingTue Jun 10 1986 07:33107
    I've just added this conference to my list of reading material for
    everyday fun, and this topic is a goody ... be prepared for a long
    read as I have just read a large proportion of the replies here
    and I have some answers, abuse and general comments.
    
    For those of you who  don't know me (that's all of you I suspect)
    my background is largely language/compiler oriented; some theory,
    some practice, some programming.  So, lets get on with the abuse,
    I mean comments.
    
    First, some popular misconceptions that have appeared in the notes
    wot I have read:
    
    Basic is a good interactive language.
    For those of a sensitive disposition, I apologise for using 'good'
    and 'Basic' in the same sentence.  There is a system called the
    CAP in Cambridge which uses Algol68 as its command language.  The
    Bourne shell on Unix uses a similar sort of thing (without practically
    all of the power) and both leave Basic standing.
    
    C is fast.
    Apologies, again to the sensitive.  I have had seven years of C
    programming and I hate it.  It is not fast -- high-level assembler
    (which is what C is) is not inherently quick.  Just because there
    is a one-to-one mapping from C statements to machine instructions
    does not make it quick.  Real high level languages are so much quicker
    it is untrue.
    
    Ada is a high-level language.
    It is not.  It is crap.  I have read the book, talked to some compiler
    writers, seen the bugs.  I have not seen a full and working Ada
    compiler and I don't suppose I ever will; the language is
    unimplementable (like Algol60 was).  I believe the Ada design committee
    (or whatever they call themselves) are considering a complete
    re-design, hopefully they will get it right.
    
    There exists a universal language.
    (Although it hasn't been designed yet.)  Not at all true, have you
    counted the number of languages under Unix recently?  Let me give
    you a sample: Bourne shell, C shell, Awk, sed, ed, troff, tbl, pic,
    eqn, lisp, C, fortran, prolog, bc, dc, icon, apl, algol68, algol60(?),
    assemblers (many), em, pascal, modula-2, clu, yacc, lex, grap, C++,
    fp, poplog.  I've run out -- there are more and doubtless I'll think
    of them after I've finished this.  But can you, any of you, come
    up with a design which embodies all of the concepts in these languages?
    Trying for a universal language, layered or otherwise is just not
    on.  We don't know enough about languages to even make a good guess
    at it.  Look at Ada...
    
    
    
    Now for some opinions about what makes a good language.  I must
    confess to stealing other people's opinions here, without
    acknowledgement.  I'll present my opinions without much
    qualification, otherwise I'll be here all day.
    
    Languages should be strongly typed.
    I don't care what the application is, strong typing gives the compiler
    a chance to produce really good code (it knows what you are doing)
    and gives the program less chance of falling over.
    
    Garbage collection.
    Definitely, a "must have", makes programs go faster.  There's none
    of that tedious mucking about freeing space (or forgetting to, or
    freeing by accident) whenever you have finished with it, or think
    you have.  In a multi-tasking environment the garbage collector
    is run as a low priority process so you never even see it (unless
    the system suddenly panics and needs a lot of space, but we aren't
    talking about real time operating systems here, and even if we were
    I would still advocate a garbage collector).  You are less likely
    to write programs that fall over if you aren't worrying about space
    allocation.
    
    String handling.
    Not just string handling.  Sensible operations like 'add' between
    any non-basic types, anyone can add integers or floats, but I rarely
    use numbers in anger - it's much better to be able to say things
    like 'plural := singular + "es"' or 'begin_symbol := 
    symbol_table ["begin"]' -- the last thing I want to do is worry
    about allocating the right amount of space for 'plural' or noticing
    that all I need to do is change the pointer represented by singular,
    free up the old value and then have 'plural' and 'singular' pointing
    to the same place but with different lengths.  Ditto symbol tables,
    I know what hash tables are and how to do them.  I don't really
    want to, I would rather the language lets me get at a nice package
    in a nice way to do it for me, efficiently, if that's the way it
    wants to do it.  Who knows, it may know about some fancy machine
    instructions or a hardware hash algorithm that I don't.
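    For what it's worth, both wished-for expressions already run almost
    verbatim (modulo `:=`) in some dynamic languages.  A sketch in
    present-day Python, with the symbol-table contents invented purely
    for illustration:

```python
# The runtime manages string storage: no byte counting, no manual freeing.
singular = "match"
plural = singular + "es"                 # plural := singular + "es"

# A built-in associative table stands in for the hand-rolled hash table.
symbol_table = {"begin": 1, "end": 2}    # invented contents, for illustration
begin_symbol = symbol_table["begin"]     # begin_symbol := symbol_table["begin"]
```

    The hashing and the storage juggling happen behind the scenes, which
    is exactly the division of labour being argued for here.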
    
    Semi-colons should be shot.
    Ditto other small syntactic marks like full stops, commas etc.  I
    don't think it is possible to get rid of all small syntactic marks
    without making Algol-like programs unreadable.  Lisp would be better
    without brackets though.  But semi-colons as statement separators
    are definitely redundant.  Come on, admit it, how many of your syntax
    errors are due to a missing or spare semi-colon (or equivalent)?
    The obvious and flippant answer is to design a language that doesn't
    *need* statement separators.  It is possible, and usable and it
    does indeed make trivial syntax errors less common.
    
    
    
    I must have got someone's back up with all this, or someone might
    even agree with me.  If you argue with me, I might even be able
    to justify some of my more way-out comments!
    
    						JCH
87.74Back to the tower!CSTVAX::MCLUREVaxnote your way to ubiquityTue Jun 10 1986 12:3776
    	Yay!  This discussion lives on!  I was beginning to think that
    the tower of babble had crumbled to the ground and everyone had gone
    back to their own respective languages for another 1000 years (myself
    included).

re. -1
    	I think I may agree/disagree with many of your points, but I'm
    still scratching my head to figure out which side you're arguing on
    for many of them.  It seemed that you began to address certain previous
    replies in this note, am I right?  Or were you just throwing out some meat
    to the lions?  I hate to cramp your style, but I think it might be a
    little easier to identify your answers from your general comments
    by using the ">" symbol.  For example:

>    I've just added this conference to my list of reading material for
>    everyday fun, and this topic is a goody ... be prepared for a long
>    read as I have just read a large proportion of the replies here
>    and I have some answers, abuse and general comments.
 
    	Sorry, I know how much you hate syntax, but I'm betting that you
    stand a better chance of getting this discussion rolling using this
    method.  I'm not sure who thought it up, but it works for me.  You
    might want to check the notesfile HUMAN::ETIQUETTE for other such
    goodies.  (:-)

    	You mentioned Basic, I personally wish you hadn't.  I'm currently
    forced to use it and have a goal in life of obliterating it from the
    face of the earth.  I'm going to Colorado Springs next week to
    interview for a job which would involve just that (upgrading existing Basic
    code to Pascal).  I just got through spending three days debugging
    a Basic program which seemed to be eating data; it turned out that it
    was a misspelled implicit variable (ARRRGH)!
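    That Basic bug generalizes: in any language where an assignment
    silently creates a variable, a one-letter slip makes data quietly
    vanish.  A sketch of the same failure in present-day Python (function
    and data invented for illustration):

```python
def total_sales(amounts):
    # Buggy version: the one-letter typo binds a brand-new variable
    # on every pass instead of updating the accumulator.
    total = 0
    for amount in amounts:
        totl = total + amount   # typo for "total": silently bound, then discarded
    return total                # always 0; the data has been "eaten"

def total_sales_fixed(amounts):
    # Corrected version: the accumulator is actually updated.
    total = 0
    for amount in amounts:
        total = total + amount
    return total
```

    Neither version raises any error, which is why a bug like this can
    soak up three days of debugging.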

    	As far as C goes, you may have a point there; I have always held C
    in high regard until I began to understand just how cryptic it
    (and *Unix for that matter) can be (yuck).  I haven't given up on
    it yet, I've just placed it on the shelf alongside the rest of my
    collection of square bowling-balls.

>    Ada is a high-level language.
>    It is not.  It is crap.  I have read the book, talked to some compiler
>    writers, seen the bugs.  I have not seen a full and working Ada
>    compiler and I don't suppose I ever will; the language is
>    unimplementable (like Algol60 was).  I believe the Ada design committee
>    (or whatever they call themselves) are considering a complete
>    re-design, hopefully they will get it right.
    
    	I can't say that I have yet written an Ada program, but I'm planning
    on it (for fun and excitement).  I wonder why you seem to think that
    it can't be done?  It has been done - very well (I read in the marketing
    literature) on VAX/VMS at least.  I noticed from your C++ note that you
    have an older version of VAX/VMS (2.3?), maybe you need to check into
    an upgrade before jumping to any conclusions about Ada.

    .
    .
>    Trying for a universal language, layered or otherwise is just not
>    on.  We don't know enough about languages to even make a good guess
>    at it.  Look at Ada...

	A universal language is not an impossibility; in fact one has existed
    for years (in theory at least - see my note 87.68).  Ada was a nice
    try, but I think WE noters can plan out a layered language which allows
    for future generation layers to be added on top and finalized layers to
    be embossed in silicon on bottom.
    
	I agree with you on strong typing, garbage collection, and
    string handling, but would be even more pleased to see compiler
    switches to turn these things on and off.  At the higher levels,
    the compiler could be interactive enough to ask the user what they
    really "meant" by any ambiguities and/or use a personal user profile
    to help determine what a particular user's programming style would be
    conveying.  With that, I will return to my pipe-dream and let the
    replies start flying in.

						-DAV0
87.75$.02CIM::JONANTue Jun 10 1986 12:5425
    
    Re: .73
    
    "Ada is a highlevel language.  It isn't. ..."
    
    There may be some good reasons to criticize Ada but none can be
    found in this paragraph.  Clearly you haven't tried VAX Ada (or
    even its decent competitor, Verdix on Ultrix).  VAX Ada is a very
    mature product, even though it is still at version 1 and implements
    if not everything in the LRM, then certainly most of it (certainly
    all of the requirements plus a good deal of the "options").
    
    "A universal compiler is a lost cause" (paraphrased)
    
    Yes, I can think of a language that can express all of the concepts
    in those listed and a lot more.  Namely, English or most any other
    natural language (What are those languages' specifications and
    requirements written in anyway??)  I used to think that there was
    no way that a "true natural language processor" could be written.
    I'm still not convinced, but I wouldn't say that it is a lost cause
    to come up with a "universal compiler" based on (a) natural language.
    (And this coming from a mathematician and logicist - what's the
    world coming to anyway?!)
    
    /Jon
87.76Keep 'em coming...CSTVAX::MCLUREVaxnote your way to ubiquityTue Jun 10 1986 14:0115
re. 87.75,

	If you are interested in natural languages (and this goes for
    everyone here), you should check out the CDR::NATURAL_LANGUAGE_COMPUTATION
    notesfile.  If nothing else, put your name on the interest list (note 2),
    and start recieving the mailing literature which the moderator has
    promised to send out (I'm still waiting...but I'm patient).


re 87.73,

	Ignore what I said about the use of ">" in noting.  I must have
    been in one of my anal moods.

						-DAV0
87.77. by BACH::VANROGGEN Tue Jun 10 1986 15:07
    Saying that some "natural language" like English provides an
    existence proof of a universal language is just falling into
    the Turing tarpit and is of no constructive use.
    
    Ease of expression and conciseness are what distinguish
    languages, not theoretical power (which is assumed).
    
87.78. "Turing tarpit???" by CSTVAX::MCLURE (Vaxnote your way to ubiquity) Tue Jun 10 1986 16:37
re. -1,

	Ok, why is the Turing machine a tarpit?  Is it because by basing
    future language constructs on an antique machine such as this, you
    may find yourself "stuck" in an antique architecture?

						-DAV0
87.79. "still arguing conflicting cases" by VOGON::HAXBY (John Haxby, IPG/Ultrix, Reading) Wed Jun 11 1986 06:24
    As I started all this I ought to justify some of my more outrageous
    claims and disagree with what I said (Dave was right in .74, I wasn't
    arguing from either side of the fence).
    
>						... I think it might be a
>   little easier to identify your answers from your general comments
>   by using the ">" symbol.
    
    Notwithstanding .76, Dave was right.  I apologise for the confusion;
    the reason I didn't quote things was that I was answering so much
    that the quotations would have got in the way.
    
    Re Ada:
      I am a language purist, I like some languages for what they are
    even though I have never programmed in them, I dislike others
    similarly.  My favourite fault with Ada (which may or may not have
    been fixed, I forget) is that when a task dies improperly, it takes
    its ancestors with it.  Fine, except that one of the ancestors is
    the operating system kernel...  There are others, but I will talk
    to my friendly Ada expert before sounding off about them.
    
      The implementations of the language aren't of much interest from
    my side of the fence: until the language is well defined and reasonably
    secure (jargon term there, sorry) it isn't worth (again from my point
    of view) trying to build a useful compiler/kernel.
    
    
    Re. Turing machines as universal languages.
    Correct.  You can do anything with a Turing machine, that is provable.
    I did it once as part of a post-graduate logic course.  The only
    other thing provable about Turing machines is that you can't prove
    anything else.  That means you can't tell if the machine will stop
    (the "halting problem"), you can't tell if it will generate the
    correct results (corollary to the halting problem) and have you
    ever tried to write Turing machine 'programs'?
    
    As it happens, most languages can be shown to be equivalent to Turing
    machines, usually by implementing the original definition in them
    (for that see .68, it's pretty short).  The most common that isn't
    a Turing machine is 'pure' lisp, that is, lisp without the 'setq'
    (or 'set') which is useless (almost) but very pure and generates
    provable programs.
    
    English as a universal (programming) language
    (You are welcome to skip this paragraph...)
    And coming from a mathematician, I'm ashamed!  There are four types
    of language (defined by Chomsky in '36 or '56, I forget which; anyway,
    he's still kicking around).  Type 3 (regular expressions) is easy,
    they are context insensitive and can't count (ie can't match brackets).
    Anybody can do regular expressions.  Type 2 is context insensitive
    and can count pairs of things (BNF to you).  Type 1 is a goody,
    it is context sensitive and can count any number of things (eg matching
    triples; context sensitivity lets you have declarations as part
    of the grammar).  You can specify Type 1 languages with van Wijngaarden's
    two-level grammars, like wot Algol68 was specified.  Type 0 is context
    free, you can do what you like with it.  English is Type 0.  The
    upshot is that Type 3 grammars are easy to parse, Type 2 a bit harder,
    Type 1 very tough, and Type 0 practically impossible.  A little and
    famous example to show how difficult English is to parse:  "This
    pen leaks".  Obvious, huh?  As it happens it is in a book about
    pig farming underneath a picture...
    
    Universal programming language.
>   "A universal compiler is a lost cause" (paraphrased) (.75)
    I think language is meant here but I will skip that; certainly I
    meant (programming) language originally.  We can forget Turing
    machines, they are a pain.  Almost but not quite useless.  What
    we really need is a language we can prove things about.  I don't
    care what the language is, just so long as I can prove things about
    it.  It is not enough to test a program (suite) and say it works,
    it'll be out in the field for maybe 10 minutes before someone finds
    a bug, hence all the Beta-site tests, minor maintenance releases
    of, eg VMS.
    
    Occam isn't a bad language, it is provable, easy to implement and
    lends itself rather nicely to multi-processor environments.  It's
    almost as bad as lisp for its syntax, but never mind.  The real
    trouble is that it is too low a level to do anything useful with
    it.  So, next stage - use it as an intermediate code.
    
    There are several things you could compile from: Pascal, C, and
    languages of that ilk.  The awful parts of both languages (which
    cause the unprovability) simply wouldn't compile (trivial result:
    you cannot generate a provable program from an unprovable one, though
    that in itself isn't provable!)
    
    Even better, you could use a real high level language to compile
    to Occam, eg CLU, Algol68RS.
    
    See, look, I have a language system which is layered, cross-compilable
    and portable.  I can mix CLU, '68, C and, if desperate, Pascal at
    will and it will work.  Otherwise it won't compile.  Slight pipe-dream,
    alas, I don't think Occam supports some of the fundamental features
    of those languages, but I don't have a definition anywhere to hand.
    
    
    There is another 'universal' language around.  It's called "satan"
    (its predecessor was called "lucifer").  It's an extensible language
    of the sort one would like Forth to be.  You actually specify new
    grammatical constructs in existing ones, not to mention the ones
    you are defining at the moment.  I don't know a great deal about
    it, except that the underlying concepts are sufficiently powerful
    to give me headaches.
                                               
    
    While I was waiting for notes to print the messages that came in
    overnight I was scribbling down a rough sketch of the layers of
    a layered 'universal compiler' (I'm beginning to hate that phrase
    as it doesn't seem to have any meaning, but I can't think of anything
    better.)  I noticed, after I had finished, that I was describing
    something like an uprated version of the Amsterdam Compiler Kit
    but I had Occam and/or Forth in there as an intermediate language
    above the 'machine independent' assembler, em.  Just thought I'd
    mention it...
    
    
    						jch
    
    
    PS Sorry it's a long note again.
87.80. by MYCRFT::PARODI (John H. Parodi) Wed Jun 11 1986 12:04
Re: .74

>    I just got through spending three days debugging
>    a Basic program which seemed to be eating data, it turned out that it
>    was a misspelled implicit variable (ARRRGH)!

Well, that's why they invented OPTION TYPE = EXPLICIT.  Using this statement
would have reduced your debug time to three seconds...

JP
87.81. "no, that's why they invented declarations" by VOGON::HAXBY (John Haxby, IPG/Ultrix, Reading) Thu Jun 12 1986 08:49
    That's no excuse.  Languages which don't have declarations should
    also have program length severely restricted.  Declarations were
    invented to avoid programs failing due to identifiers being declared
    without anyone knowing about it.  As to where declarations should
    go, that's another story...
    
    						jch
    
    
    PS Declaration-free languages tend to be invented because they are
    for short snappy things.  Trouble is, they wind up being far more
    useful than originally intended and you get big programs: ever tried
    debugging tpu or mlisp files?
87.82. "More Ramblings..." by CIM::JONAN Thu Jun 12 1986 12:16
Re: .77

        In NO WAY was I offering an existence proof of a "universal compiler"
    (by natural language analogy or otherwise)!!!  Also, I have no idea
    what the quaint phrase "Turing tarpit" refers to (see (possibly) below...).


Re: .79

>   My favourite fault with Ada (which may or may not have
>   been fixed, I forget) is that when a task dies improperly, it takes
>   its ancestors with it.  Fine, except that one of the ancestors is
>   the operating system kernel...  There are others, but I will talk
>   to my friendly Ada expert before sounding off about them.
    
        When tasks die (or terminate normally) they do not take their ancestors
    with them.  In fact, the terminated task will not even deactivate until
    all of its *dependents* also terminate.  This is clearly stated in
    sections 9.3 and 9.4 of the LRM.  Also, as far as I have been able to
    tell the language definition is and has been stable since 1815A was
    published in 1983 (of course there is the usual tweaking being considered,
    but no major overhaul).
    
>   Turing machines as univeral languages.

    Turing machines cannot recognize unrestricted grammars (see below).
    Other Turing machine notes:  this business of provable is completely
    dependent on the axiom system that you're working in.  The so called
    "halting problem" is the Turing equivalent of decidability in First
    Order Predicate Calculus.  And this is inextricably tied up with the
    business of self referencing sentences of the language.  Of course this
    leads into the fascinating subject of Model Theory and The Incompleteness
    Theorem of FOPC.  But, this stuff is better placed in the mathematics
    notes file....


>   As it happens, most languages can be shown to be equivalent to Turing
>   machines.

    Not true, most programming languages are described by Context Free grammars
    which are not as general as those recognized by a Turing machine.  Of
    course, if you mean you can write any Turing equivalent program in them
    then you're correct, but even assembly language has this feature.
    (Strictly speaking, this isn't true, you would need infinite resources
    to do this; by definition the tape in a TM is unbounded AND linear!)

>   There are four types
>   of language (defined by Chomsky in '36 or '56...

        I believe that you have a lot of this backwards.  First it was in 1965
    that Chomsky conjured up his four classes ('36 was the year Turing conjured
    up the TM). (Boy, this reply is getting long!)  Anyway the classes are
    divided into those that are *phrase structured* and those which aren't.
    Also, the classes refer to *grammars* NOT languages (a language may
    be described by many different grammars, possibly all of the same class).
        Basically, phrase structured means that a grammar is defined by
    a four-tuple <N,SIGMA,P,S>, where SIGMA = a terminal alphabet,
    N = a nonterminal alphabet, P = set of productions, and S = some start
    symbol.  And such that SIGMA and N have empty intersection and
    y -> x IN P implies y & x are in the union of N, SIGMA and the empty string
    and y contains at least one element of N.
        The classes are:

    Unrestricted grammars.  These are the most general and are not phrase
    structured.  Natural languages fall in here.

    The others are *context-sensitive*, *context-free*, and *right-linear*
    (or *regular*) and are all phrase structured.

    The most general is context-sensitive and is characterized by x -> y
    implies that |x| <= |y| (|x| = Length of x).  Hence, y can't be empty_stg.

    The next most general is context-free and is characterized by x -> y
    implies that x is an element of N.  Here y may be empty and thus CFG's
    containing a production A -> empty_stg cannot be context-sensitive.

    The least general is right-linear (or regular), characterized by each
    production in P having the form A -> xB | x, A & B in N and x in SIGMA
    or x = empty_stg.  Clearly these guys are a subset of CFG's

    Now, the point of this more or less arbitrary classification is that
    the last 3 are recognizable by 3 different finite automatons.
    Regular grammars are recognizable by Finite State Automatons, context-free
    grammars are recognized by Push Down automatons (an FSA with an arbitrarily
    large stack "tacked on") and context-sensitive grammars are recognizable
    by none other than our ol' friend the Turing Machine (here restricted
    to a tape that can't grow larger than the size of the input string).

    And if this were not enough, grammars can be ambiguous.  English is
    full of this property and your example "This pen leaks" is a good one
    (the usual one being "Time flies like an arrow").  The really interesting
    thing here is that the difficulty lies in the semantics of the language,
    not in the syntax.  "Semantical context" is all important in resolving
    ambiguities and it involves *understanding* the given sentences (whatever
    that may mean).  To my mind, it is this stuff that has to be figured
    out in a rigorous manner before the computer/natural language problem can
    make SIGNIFICANT headway.  (After all, "I ain't got no money" is hardly
    grammatical, but is certainly recognizable and understandable!)

>   What we really need is a language we can prove things about.

    AND STILL SUFFICIENTLY POWERFUL to express your problems in.
    Yes, this would be nice, but program verification is a "rat-hole" for
    another time.


    /Jon
87.83. "Nuke implicit variables" by CSTVAX::MCLURE (Vaxnote your way to ubiquity) Thu Jun 12 1986 13:17
Re: .80

>>    I just got through spending three days debugging
>>    a Basic program which seemed to be eating data, it turned out that it
>>    was a misspelled implicit variable (ARRRGH)!

>Well, that's why they invented OPTION TYPE = EXPLICIT.  Using this statement
>would have reduced your debug time to three seconds...

	Unfortunately, I didn't have the option of using OPTION TYPE = EXPLICIT
    due to a somewhat stifling manager who wants me to code "his way".  If
    I had my way, I wouldn't have used Basic to begin with.  As if it wasn't
    bad enough that the typo existed (in someone else's map file), what made
    matters worse was the fact that the variable names used were so cryptic
    (SPRI$ should have been SPRIM$).

						-DAV0

87.84. "What can I say?" by METOO::LAMIA Thu Jun 12 1986 18:12
>Unfortunately, I didn't have the option of using OPTION TYPE = EXPLICIT
>due to a somewhat stifling manager who want's me to code "his way"....
    
    I'm going to resist making the OBVIOUS reply.........
    
    

87.85. "Programming Languages are Turing Machines" by LATOUR::RMEYERS (Randy Meyers) Thu Jun 12 1986 21:18
Re .82:

Every programming language that I know about is equivalent to a Turing
Machine (subject only to memory constraints).  The point is not that
the programming language itself can be recognized by a context free
grammar, but that the problems that can be solved using the programming
language are the class of problems that can be solved by a Turing
Machine.

This, by the way, is the "Turing Tar Pit."  The pit is that every
programming language can solve any problem, but many programming
languages produce ugly, slow, unmaintainable solutions.  For example,
a language that lacks floating point math is a poor choice for writing
a statistical analysis system in, but that doesn't mean that you cannot
use integer math (maintaining mantissas in one set of variables and
exponents in another) to write a routine that does curve fitting.
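The curve-fitting remark above is concrete enough to sketch.  Here is a
minimal (mantissa, exponent) representation in integer arithmetic only
(Python for illustration; the names, the decimal base, and the six-digit
precision are all arbitrary choices of mine, not anything from the note):

```python
PRECISION = 6  # keep at most 6 significant decimal digits in the mantissa

def normalize(m, e):
    """Trim the mantissa to PRECISION digits, adjusting the exponent."""
    while abs(m) >= 10 ** PRECISION:
        m //= 10
        e += 1
    return m, e

def fmul(a, b):
    """(m1, e1) * (m2, e2) = (m1*m2, e1+e2), then renormalize."""
    m, e = a[0] * b[0], a[1] + b[1]
    return normalize(m, e)

def to_float(x):
    """Decode for display only -- the arithmetic itself never uses floats."""
    return x[0] * 10.0 ** x[1]

half = (5, -1)    # 0.5 as mantissa 5, exponent -1
three = (3, 0)    # 3.0
print(to_float(fmul(half, three)))   # 1.5
```

Ugly and slow compared with hardware floating point, which is exactly the
tar pit: possible everywhere, pleasant almost nowhere.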

If you would like a proof (left to the reader :-)) that any programming
language can solve the problems of a Turing Machine, look up the definition
of a Turing Machine in any good textbook, and write a program to simulate it.
The only problem you will have is that the array that implements the
storage tape isn't unbounded in length.
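Taking up that exercise, a minimal simulator might look like this (Python;
the (state, symbol) -> (state, symbol, move) encoding is one common
textbook formulation, and the step limit is my addition to keep the sketch
terminating):

```python
def run_tm(rules, tape, state="start", blank="_", limit=10_000):
    """Run a one-tape TM until it enters 'halt' or the step limit is hit."""
    tape = dict(enumerate(tape))   # a dict stands in for the unbounded tape
    head = 0
    for _ in range(limit):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, written, move = rules[(state, symbol)]
        tape[head] = written
        head += 1 if move == "R" else -1
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# Example machine: flip every bit, halting at the first blank.
flipper = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt",  "_", "R"),
}
print(run_tm(flipper, "1011"))   # 0100
```

The dict-backed tape grows on demand, which is as close to "unbounded" as
a finite machine gets -- the only caveat the paragraph above already notes.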

I first saw the phrase "Turing Tar Pit" in Paul N. Hilfinger's dissertation
(it won the ACM Doctoral Dissertation Award one year).  (Hilfinger was
a member of the Ada design team.)  I don't remember who Hilfinger credits
with originating the phrase.

Many people seem to find that statement that all programming languages are
equally powerful pretty bizarre.  They immediately raise some objection
like, "Well, in Bliss I can target variables to particular registers.  Let
me see you do THAT in Basic."  Programming languages are equal from
an algorithmic point of view; that doesn't mean that some problem with
non-algorithmic constraints can be solved by every programming language.
Just as you can't write a subroutine in BASIC (in most BASICs anyway)
that gets its argument in register 7, doesn't mean that you couldn't
write a BASIC program that would solve any problem that seems to
necessitate writing a program that calls a subroutine that receives
an argument in R7.
87.86. "TM's ARE equivalent to Type 0 languages" by TLE::FELDMAN (LSE, zealously) Thu Jun 12 1986 23:20
    Re Automata and Grammars
    
    The class of languages that can be recognized by a Turing Machine is
    the class of languages described by Type 0 (unrestricted) grammars. 
    
    My recollection is that the automaton corresponding to Type 1 grammars
    (context sensitive) is the Stack Automaton, but since I don't have
    my copy of Hopcroft & Ullman handy, I'm not going to go out on a
    limb and make statements about things I don't know for sure.
    
    In the case of Type 2 grammars (context-free), you need to be precise;
    the corresponding automaton is the Non-Deterministic Push Down
    Automaton.  Deterministic PDA's are not as powerful as NPDA's.
    
    Chomsky's language papers were published in the late 50's.  The
    two citations I have handy are: "Three Models for the Description
    of Language," IRE Transactions on Information Theory, vol. 2, no.
    3, 1956; and "On Certain Formal Properties of Grammars," Information
    and Control, vol. 2, no. 2, 1959.
    
       Gary
87.87. ":-)" by PASTIS::MONAHAN Fri Jun 13 1986 05:20
    re : .82
    Time flies like an arrow.        But
    Fruit flies like a banana.
87.88. "RE: Halting problem" by RACHEL::BARABASH (Bill Barabash) Fri Jun 13 1986 10:00
  RE: .82 (Halting problem):

>   Other Turing machine notes:  this business of provable is completely
>   dependent on the axiom system that you're working in.  The so called
>   "halting problem" is the Turing equivalent of decidability in First
>   Order Predicate Calculus.  And this is inextricably tied up with the
>   business of self referencing sentences of the language.  Of course this
>   leads into the fascinating subject of Model Theory and The Incompleteness
>   Theorem of FOPC.  But, this stuff is better placed in the mathematics
>   notes file....

  The proof of the unsolvability of the halting problem depends only on the
  fact that it is possible to assign a unique integer to each Turing machine
  (or to each valid program written in your favorite programming language).
  You can do this by simply ordering all strings first by length and then
  alphabetically, then throwing away those that are not valid programs.
  Therefore for any program in any programming language it is possible to
  use this procedure to get the program's number, and to recreate the text
  of any program given only its number.
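  The numbering just described is concrete enough to sketch (Python, with
  a two-letter toy alphabet and a stand-in validity test of my own; any
  finite alphabet and any decidable syntax check would do):

```python
from itertools import count, product

ALPHABET = "ab"   # toy character set; a real language would use its own

def strings_by_length():
    """Yield every string over ALPHABET: '', 'a', 'b', 'aa', 'ab', ..."""
    for n in count(0):
        for chars in product(ALPHABET, repeat=n):
            yield "".join(chars)

def number_of(program, is_valid_program):
    """Return the unique integer assigned to a valid program: its rank
    among valid programs in length-then-alphabetical order."""
    i = 0
    for s in strings_by_length():
        if is_valid_program(s):
            if s == program:
                return i
            i += 1
```

  Recreating a program from its number is the same walk run in reverse:
  enumerate valid strings and stop at the i-th one.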

  Now consider a program which returns one if it is given the number of a
  program which halts with its own number as input, or else zero if it is
  given the number of a program that loops infinitely given its own number
  as input.  It would be possible to write such a program if and only if
  the halting problem was decidable.

  Now call the above program from another program:

	Program contradiction (x) =
	Start: If halts(x)=1 then goto start
		else return 1.

  Now if you run this program with its own number as input, it will loop
  infinitely if the "halts" program says that it terminates on its own number,
  and will terminate if the "halts" program says that it will loop infinitely
  on its own number.  By contradiction, we conclude that it is not possible
  to write the "halts" program, therefore the halting problem is unsolvable.

  I see no dependence on any axiomatization here!  If you disagree with the
  conclusion, feel free to supply a "halts" procedure as counterexample :-)

  -- Bill B.
87.89. "Turing Tarpit attribution" by TLE::HOBBS Fri Jun 13 1986 10:21
I believe the term "Turing Tarpit" is attributed to Alan J. Perlis.
87.90. "More on .82 (references)" by GALLO::AMARTIN (Alan H. Martin) Fri Jun 13 1986 18:53
Re .86:

The name used for a recognizer for context-sensitive languages
in Aho & Ullman is "(two-way, nondeterministic) linear bounded automaton".
I haven't read Hopcroft & Ullman, so they could indeed use another name.

Re .82:

I also believe that some of the definitions you used do not agree
with those used in some traditional texts.

Quotes from Aho & Ullman's "The Theory of Parsing, Translation,
and Compiling; Volume I: Parsing" are noted as [AU].

Quotes from Gries's "Compiler Construction for Digital Computers"
are noted as [Gr].

Sorry for the redundancy with the previous replies, but I wanted to
get references for most of these things, and other people snuck in
before I pulled this all together.
				/AHM/THX



.82>    Turing machines cannot recognize unrestricted grammars (see below).

[AU] 2.1.3 (Restricted Grammars), p 91:

"_G_ is said to be:

(1) _Right-linear_ if ...
(2) _Context-free_ if ...
(3) _Context-sensitive_ if ...

A grammar with no restrictions as above is called _unrestricted_."

p 92:

"The (unrestricted) grammars define exactly the recursively enumerable
sets."

[AU] Section 2.1.4 (Recognizers), p 96:

"(4) A language _L_ is recursively enumerable if and only if _L_ is defined
by a Turing machine."



.82>>As it happens, most languages can be shown to be equivalent to Turing
.82>>machines.
.82>
.82>Not true, most programming languages are described by Context Free grammars
.82>which are not as general as those recognized by a Turing machine.

You are not talking about the same thing.  Most programming languages
pretend to have their syntax described by one or more CFLs, which
invariably describe many illegal programs (programs missing declarations,
programs which are too big, etc.)  Also, most programming languages
pretend to have the infinite store of a Turing machine, though they don't
as they are inevitably implemented on finite-sized hardware.



.82>>  There are four types
.82>>  of language (defined by Chomsky in '36 or '56...
.82>
.82>   I believe that you have alot of this backwards.  First it was in 1965
.82>   that Chomsky conjured up his four classes . . .

[Gr] Section 2.10 (Survey of formal language theory and reference), p 46

"Chomsky(56) defined four basic classes of languages in terms of grammars,
which are 4-tuples (V, T, P, Z) where . . ."

[Gr] References, p 470:

"Chomsky, N. (56)  Three models for the description of language. IRE Trans.
Inform. Theory, vol IT2, (1956), 113-124."



.82>    Also, the classes refer to *grammars* NOT languages . . .

It is proper to use the classes to refer to either.

Section 2.1.3 (Restricted Grammars), p 92:

"Convention

If a language _L_ can be generated by a type _x_ grammar, then _L_ is
said to be a type _x_ language, for all the _"type x"'s_ that we have
defined, or shall define."



.82>Anyway the classes are
.82>divided into those that are *phrase structured* and those which aren't.
	-and-
.82>Unrestricted grammars.  These are the most general and are not phrase
.82>structured.

The term "phrase structure grammar" has several different definitions:

[AU] Section 2.1.2 (Grammars), p 84:

"In this section we will look at a class of grammars called Chomsky
grammars, or sometimes phrase structure grammars."

On the other hand:

[Gr] Section 2.10 (Survey of formal language theory and reference), p 46

"First of all, we say that G is a (Chomsky) _type 0_, or a _phrase-structure
grammar_ if . . ."

And finally:

Frank DeRemer has used the term "phrase-structure grammar" to denote a grammar
which describes the high-level syntax of a language, as opposed to a grammar
which describes the syntax of lexemes in the language (the "lexicon").
See section 5.1 (Self-Describing Grammars), p 53 of "Compiler Construction;
An Advanced Course", edited by Bauer and Eickel.



.82>The least general is right-linear (or regular), characterized by each
.82>production in P having the form A -> xB | x, A & B in N and x in SIGMA
.82>or x = empty_stg.  Clearly these guys are a subset of CFG's

Not everyone defines "right-linear grammar" as restrictively as you, though
you are not alone:

[AU] Section 2.1.3 (Restricted Grammars), p 91:

"Definition

_G_ is said to be

(1) _Right-linear_ if each production in _P_ is of the form _A_ -> _xB_
or _A_ -> _x_, where _A_ and _B_ are in Nu and _x_ is in Sigma*."

So, the following grammar G is right-linear:

	G = ({S}, {f, o}, {(S,foo)}, S)



.82>Now, the point of this more or less arbitrary classification is that
.82>the last 3 are recognizable by 3 different finite automatons.
.82>Regular grammars are recognizable by Finite State Automatons, context-free
.82>grammars are recognized by Push Down automatons (an FSA with an arbitrarily
.82>large stack "tacked on") and context-sensitive grammars are recognizable
.82>by none other than our ol' friend the Turing Machine (here restricted
.82>to a tape that can't grow larger than the size of the input string).

In general, PDAs and LBAs (those restricted TMs) are not "finite automatons".
87.92. "Garbage, Occam and the Universal Language" by MUD::COOK (Neil) Sat Jun 14 1986 01:29
> From Note 87.73 "John Haxby, IPG/Ultrix, Reading"
>    Garbage collection.
>    Definitely, a "must have", makes programs go faster.  There's none
>    of that tedious mucking about freeing space (or forgetting to, or
>    freeing by accident) whenever you have finished with it, or think
>    you have.  In a multi-tasking environment the garbage collector
>    is run as a low priority process so you never even see it (unless
>    the system suddenly panics and needs a lot of space, but we aren't
>    talking about real time operating systems here, and even if we were
>    I would still advocate a garbage collector).  You are less likely
>    to write programs that fall over if you aren't worrying about space
>    allocation.

    Please give an example of a non-trivial language which allows garbage
collection to be done on all dynamically allocated storage. The Ada notes
file gives a discussion of why we will never see a full Ada garbage
collector.  Pascal and C can not even go as far as VAX Ada does in
deallocating dynamic storage.

    Garbage collection is a *good idea*, but little more. Certainly you
don't want to throw all that good storage away, and worrying about it
distracts you from your real problems. You need a language which allows
garbage collection without giving up any of the power you really need
from the language. Has anyone a candidate for this position?

    In my experience, only the programmer can really handle their data
structures correctly, and yes, that does imply wasted programmer time.


    With regard to the comment about Occam lacking power, I would point
out that its name came from the ability of its designers to keep "features"
out of the language. Note also that a new version of Occam is almost
complete which is reputed to be more powerful. Or will it just have more
features?


    In conclusion, I think we can all agree that we need power in the
languages that we use. We seem to disagree on how to get this power.
Some people favour a complex language with many features, others favour
easily extensible but simple languages (e.g. Forth). The only view I
cannot sympathise with is the one which says "Take every feature from
every language, and make it all accessible to the programmer".  I think
that clarity and consistency are more important goals which are
diametrically opposed to the goal of a "Universal Language".
87.93. "Powerful vs Flexible (extensible)" by ENGINE::BUEHLER (Don't mess with my planet.) Sat Jun 14 1986 13:29
  Take the case of TPU.  How many programmer-types have changed the appearance
of their editor 100% from, say, EDT?  [I'm sure EMACS falls into the extensible
category just as nicely.]  A programming language which provides all the
tools necessary for a programmer to build on that language would be ideal.
How many times have you developed software which ends up being used in a
way you never imagined?  The same can be said of a programming language.
The designers of programming languages almost universally establish the
domain/capabilities of their language, and then provide a few extensible
options to the language.  PASCAL provides records, user-defined data types,
etc, and functions can be written to make 'new' operations.  But the language
itself is not *that* extensible.

  The idea of 'letting nature take its course' is a good one for a period when
languages are still in their relative infancy (I'm sure we have a long way to
go.  We're still basically crossing the line from FORTRAN in production
languages.)  Let's let the programmers spend the god-knows-how-many hours
figuring out interesting things to do with languages, instead of having a
committee decide what's best (or whoever happens to have enough money to fund a
new language, like DOD.)  Of course, the programmers would have to have a fully
extensible language in order to be at their maximum creative potential.
87.94. "Halting Problem revisited" by CHOVAX::YOUNG (Chi-Square) Sat Jun 14 1986 22:44
	.re 88

>  Now call the above program from another program:
>
>	Program contradiction (x) =
>	Start: If halts(x)=1 then goto start
>		else return 1.
>
>  Now if you run this program with its own number as input, it will loop
>  infinitely if the "halts" program says that it terminates on its own number,
>  and will terminate if the "halts" program says that it will loop infinitely
>  on its own number.  By contradiction, we conclude that it is not possible
>  to write the "halts" program, therefore the halting problem is unsolvable.
>
>  I see no dependence on any axiomatization here!

	  Ah, but it is there nonetheless Bill.  Godel's theorem, and all
	its bizarre kin (and the Halting problem is one) all share the
	limitation that the conclusions reached (that you can't always
	reach a conclusion) are only valid within systems whose rules
	can map onto Godel's theorem's rule set (the First order Predicate
	Calculus).  As it happens almost all everyday complex processes
	and thoughts seem to, so it is often incorrectly stated that
	they apply to everything.  They don't.

	  But, you are saying, where are the axioms in the above example?
	They are the rules of the programming languages themselves, or
	in this case the pseudocode that you are using.  As was pointed
	out in an earlier note, almost all programming languages are
	similarly "powerful" in this respect (and it IS their power
	that causes this) but they do not have to be this way. A language
	with different rules might not have the same problems.

>  If you disagree with the
>  conclusion, feel free to supply a "halts" procedure as counterexample :-)

	  Fine, first I will construct a language (loosely) for this
	purpose:
	    Valid statements:
				Routine (...)	! for indicating the start
						  of the routine, and for
						  receiving arguments.
				Return (...)	! for returning them
				Call X (...)	! for calling routines
				Halt		! for ending a routine
	    Rules:
				No recursive routine entry.

	  Notice that there are no goto's, loops, or conditionals.
	Now we can construct a routine that, given the enumerated input
	of another valid routine or program, will return 1 if that
	routine will halt, and a 0 if not.  Since all valid routines
	in this language MUST halt (there's nothing else to do!) the
	routine is pretty simple:

		Routine (x)

		    Return (1)

		Halt


>  The proof of the unsolvability of the halting problem depends only on the
>  fact that it is possible to assign a unique integer to each Turing machine
>  (or to each valid program written in your favorite programming language).


	  As we can see, that is not the case.  Our new language can easily
	have all its valid routines mapped into the integers, but as just
	demonstrated, my favorite programming language (it's modular,
	readable, and I never have errors or need to look anything up)
	can solve the Halting Problem.

	  Of course this language is not capable of very much, but there
	are more complex (and useful) ones that also meet this criterion.
	I will be happy to supply examples on demand.

    Barry
87.95hand wavingCIM::JONANYou can't get there from hereMon Jun 16 1986 10:4396
  Re: .85

> Every programming language that I know about is equivalent to a Turing
> Machine (subject only to memory constraints).  The point is not that
> the programming language itself can be recognized by a context free
> grammar, but that the problems that can be solved using the programming
> language are the class of problems that can be solved by a Turing
> Machine.

  Programming languages are not TM's; programs written in them are.  Also,
  the recognition problem IS the point since, as you seem to agree, virtually
  any programming language is powerful enough to express a TM-equivalent
  solution.  The point is to conjure a language that allows an easy, clear,
  and flexible expression of a problem's solution (or better yet, that actually
  helps you discover a solution) and that is still recognizable by some
  effectively computable procedure.  (If you can't recognize (use) it, what
  good is it??  AND if you can recognize it but it has poor expressiveness,
  what good is it??)


  Re: .86

> My recollection is that the automaton corresponding to Type 1 grammars
> (context sensitive) is the Stack Automaton

  A Turing machine, but with the restriction that the tape's length is no
  bigger than the input string's length (i.e., a linear bounded automaton).
      
> In the case of Type 2 grammars (context-free), you need to be precise;
> the corresponding automaton is the Non-Deterministic Push Down
> Automaton.  Deterministic PDA's are not as powerful as NPDA's.

  Yes; however, in general programming languages (at least those to
  date) can be expressed in the subset of CFG's that are recognized by
  DPDA's (a good thing, lest parsers would have to backtrack all over the
  place, giving recognition time complexities far greater than O(n)).

  Other points WELL taken.
    

Re: .87

> Fruit flies like a banana.

  I really like this one!  Probably the most engaging statement so far!


Re: .88


>  The proof of the unsolvability of the halting problem depends only on the
>  fact that it is possible to assign a unique integer to each Turing machine

   Nope.  See below.

>  You can do this by simply ordering all strings first by length and then
>  alphabetically, then throwing away those that are not valid programs.

   Though this is essentially correct, there is A LOT of "hand waving" going
   on here.  To someone who didn't already know the answer, a decision procedure
   for valid TM's might seem to require a solution to the halting problem:
   procedure x identifies all and only the valid TM's; how do we know it
   will halt for all TM's fed to it (in particular, itself)?


>  ... we conclude that it is not possible
>  to write the "halts" program, therefore the halting problem is unsolvable.

   Nope, just that the "halts" FUNCTION is not Turing computable!  Note:
   the function halt is well defined and thus is legitimate as a function;
   you just can't write the "halts" program to implement it (i.e. there
   is no TM that can compute halt).  To get to the unsolvability of the
   halting problem you have to accept (WITHOUT PROOF) the hypothesis that
   Turing computability is equivalent to our concept of computability in
   general.  This hypothesis is generally referred to as Church's Thesis.
       Godel was the first to define what is equivalently Turing computability,
   in his proof of the Incompleteness Theorems of first order predicate
   calculus (the definition involved what is now known as recursion, and
   the functions so expressed were called "recursive functions").  For a
   long time after this people worried over whether this definition really
   captured the essence of general computability.  By and by numerous other
   definitions were proposed (one of which was Turing's) and all were shown
   to be equivalent to Godel's.  Hence, the evidence supports Church's Thesis
   and it is now taken (in most circles) as true.  (Hence, in this case,
   there exist non-computable functions!!  e.g. halt)

>  I see no dependence on any axiomatization here!  If you disagree with the
>  conclusion, feel free to supply a "halts" procedure as counterexample :-)

   See above.  Also, the denumerability of TM's implicitly depends on the
   Axiom of Infinity of std set theory and....

   
   /Jon

PS: WHO started this anyway??
87.96RACHEL::BARABASHBill BarabashMon Jun 16 1986 10:4833
  RE: .94

  Since my reply was to a statement about Turing machines (please go back
  and re-read) I don't see any refutation of my proof in your example of
  a language that lacks the power to simulate a Turing machine.  What
  you have proved is that there is a subset of Turing machines (or computer
  programs written in standard programming languages) for which halting is
  provable.  This is intuitively obvious -- anyone can write a small program
  that has no loops and be guaranteed that it halts.  For the more general
  case Turing's theorem holds.

  RE: proofs of program correctness

  There are some people who say that computer programs should not be written
  in languages wherein you cannot prove their correctness.  Some even go
  further and say that you should not even run programs that haven't been
  proven correct.

  Let me tell you why I do not abide by this view.  First, the correctness
  proof is more difficult to write than the program, so the proof itself is
  more likely to contain a bug.  Second, the degree of difficulty of proving
  a program correct is such that only for trivially small programs such as
  "Eight Queens" and "Towers of Hanoi" are such correctness proofs practical.
  Real software (pick any of DEC's software products) is complicated far
  beyond anyone's ability to produce a correctness proof.

  You may object on the ground that, even if human programmers lack the
  prowess to prove their programs correct, surely some great genius could
  write a general-purpose computer program to provide the proofs.  If that
  were the case, the unsolvability of the halting problem would become an
  insurmountable hurdle.

  -- Bill B.
87.97RACHEL::BARABASHBill BarabashMon Jun 16 1986 10:537
  RE: .95 (hand waving)

  It is quite simple to recognize valid Turing machines or programs written
  in Turing-equivalent programming languages -- compiler writers do it for
  a living.  I stand by my proof.

  -- Bill B.
87.98I say, WHO started this??CIM::JONANYou can't get there from hereMon Jun 16 1986 11:5025
    Re: .97
    
    Nothing in your "proof" addresses the task of recognizing valid
    TM's!  Who disagrees with the proposition that you can recognize
    them???  Standing by your "proof" just means that you are willing
    to accept Church's Thesis (So am I, six days a week.  Oh heck, it
    probably is true (this must be one of those days...)). :-)
    
    Re: .90
    
    Interesting, thanks for the info.  Oh, 
    
    > In general, PDAs and LBAs (those restricted TMs) are not "finite
    > automatons
    
    Yes, but they are finite STATE automatons!
    
    Re: .94
    
    I'm not completely sure, but I think that the "halting" problem
    has always referred to general computability and thus (it's one of
    those days!) Turing computability.
    
    /Jon
87.99answers to 82etc, 92VOGON::HAXBYJohn Haxby, IPG/Ultrix, ReadingMon Jun 16 1986 11:5299
    Re 82 et al.
    That'll teach me to sound off about Turing Machines and Chomsky
    without checking up on my books first.  I have an excuse: I haven't
    brought my books here yet because I haven't been at Digital long
    and I haven't got anywhere to put them yet.  However, you were right,
    mostly, all of you, and thanks for reminding me about things I had
    more-or-less forgotten.
        
    
    Re. 92
>    Please give an example of a non-trivial langauge which allows garbage
>collection to be done on all dynamically allocated storage. The Ada notes
>file gives a discussion of why we will never see a full Ada garbage
>collector.  Pascal and C can not even go as far as VAX Ada does in
>deallocating dynamic storage.
    
    I'm disappointed.  I would've thought people knew about garbage
    collecting languages other than lisp and prolog (et al).  The two
    'general purpose' languages that do garbage collection that I can
    think of right now are CLU (from Barbara Liskov at MIT) and Algol68,
    preferably the most recent version, Algol68RS (nothing whatever to do with
    Algol60 which is like Fortran but with 'procedure', a mouthful prone
    to typos if ever I saw one).
    
    CLU, short for 'clusters', is really an object oriented language.
    It has a few funnies at first sight, that is, no 'global' declarations
    and no pointers.  I thought it was strange when I first came across
    it: now I like it lots and realise that the language has a greater
    degree of expressiveness, while retaining usefulness on von Neumann
    machines, than anything I have come across to date.  Also, being
    a real high level language with a garbage collector, the compiler
    can and does generate bloody fast code.
    
    Algol68 never caught on because so few people can understand the
    definition of the language.  It is a two-level grammar defining a
    type 1 language: you can't process two-level grammars with automata.
    In fact you can simplify things a lot if you restrict the grammar
    a bit, but that's another story which I am not going to go into
    because I don't have the report here and I can't remember it that
    well.  It's also British which is usually a kiss of death for anything
    any good -- well, I'm biased.
    
    
>    	Garbage collection is a *good idea*, but little more.
    It is a lot more.  Worrying about space allocation/deallocation
    is a major pain and can easily take up half the time taken to get
    a project working.
    
>   In my experience, only the programmer can really handle their data
>structures correctly, and yes, that does imply wasted programer time.
    In my experience, only the programmer can really screw up his data
    structure management.  And does.  Frequently.  You try keeping track
    of trees and linked lists of trees and forests and symbol tables
    without losing something vital or having your available storage
    nibbled away because some routine allocates 30 bytes and can't ever
    free it up and the thing that can doesn't know it can.  There's
    a version of tar which takes some 24hours to extract a couple of
    thousand files from a tape (under VMS) because it grows very slowly
    and eventually starts thrashing VM.  And then there's that DEUNA
    driver that means the machine has to be re-booted once a week because
    it has eaten all the real memory and benchmarks run a few hundred
    times slower than they used to.  And the bugs in the IP code under
    BSD4.2 that used to crash the machine once a fortnight with "panic:
    MGET" which meant that there was no more core to allocate for buffers.
    Need I go on?  Finally, that program on the PDP11/70 which allocated
    space with no intention of ever freeing it (it 'knew' it would never
    need more than 64k) which ran some seven (7) times faster than the
    same program which carefully freed up store when it had finished
    with it, instead of letting the operating system free up the whole
    lot in one swell foop.
    
    
    Re: Occam's power (also .92)
    I know occam is powerful.  Lisp is. Ditto prolog and macro assemblers.
    I just wouldn't want to write major programs in them.  Occam is
    a wonderful language, but one of the original intentions (so I am
    told) was to use it as an intermediate language.
    
    
    Re: features (.92 and generally)
    PL/1 has lots of features.  It has very little else.  "Features"
    don't make a language powerful.  Mind-blowing concepts make a language
    powerful; things like data/program equivalence (in lisp); prolog-like
    rule evaluation; clusters (as CLU); mutable/immutable objects (also
    CLU); modularity (Algol68, but it's old hat now, though most languages
    screw it up one way or another); extensibility (only Satan does
    this properly, and I don't understand it).
    
    
    Re: the process termination thing in Ada
    I mis-quoted this.  The problem is an insecurity which can
    result in the death of all ancestors back to what amounts to the
    system call level.  It was discovered on IBM machines, where it
    resulted in unexpected demise back to the JCL.  It arises from a
    class of exceptions which cannot be caught or ignored and which
    result in the death of child and parent tasks.  I'd look it up but
    I don't have an Ada reference manual here...
    
    							jch
87.100CIM::JONANYou can't get there from hereMon Jun 16 1986 12:029
    Re: .96  (Program verification)
    
    Having done some program (actually subroutine; SMALL subroutine)
    proofs, I am VERY much inclined to agree with you.  See, I'm not
    hard to get along with!
    
    :-) /Jon
    
    PS: This should have been in .98, but I hadn't read .96 yet. Sorry.
87.101correctness provingVOGON::HAXBYJohn Haxby, IPG/Ultrix, ReadingMon Jun 16 1986 12:048
    Re. .96 and proof of correctness
    
    The desirability of correctness proofs comes from the Software
    Engineering disciplines... the point being that huge programs are
    enormously difficult to validate and one should really try to
    prove them by machine.  Anyway, you only have to write the correctness
    prover once and apply it to itself...  (no, it's not a universal panacea,
    but it helps)
87.102CFGs do not suffice for real definitionsLATOUR::AMARTINAlan H. MartinMon Jun 16 1986 12:2019
Re .95:

>Yes, however, in general programming languages (at least those to
>date) can be expressed in the subset of CFG's that are recognized by
>DPDA's (good thing, lest parsers would have to backtrack all over the
>place, giving recognitioin time complexities far greater than O(n))

And all of those same CFGs also express many blatantly illegal programs.
It takes a thick pile of natural language statements to weed out the
trash.  Try writing a CFG for a toy language which requires variables
to be declared before their first use.  Or try writing a parser for such
a language which will catch those errors, and whose worst-case time
complexity is linear in the length of the input program.

Algol-68 is an example that shows that those kinds of errors can be called
syntactic.  Its definition leans heavily on a method which is equivalent to
a context-sensitive grammar.  And I am not sure whether even that suffices
to disallow all the illegal programs without depending on supplementary text.
				/AHM
87.103CIM::JONANYou can't get there from hereMon Jun 16 1986 12:356
    
    Re: .102
    
    Point WELL taken. :-)
    
    /Jon
87.104The Halting Problem Solved?LATOUR::RMEYERSRandy MeyersMon Jun 16 1986 19:2336
Re .88:

>  I see no dependence on any axiomatization here!  If you disagree with the
>  conclusion, feel free to supply a "halts" procedure as counterexample :-)


A counterexample from Al Aho and Jeff Ullman, given in "The Theory of Parsing,
Translation, and Compiling", Volume 1, Page 36.

Exercise 0.4.21:

Let P[1], P[2], ... be an enumeration of procedures in some formalism.  Define
a new enumeration P'[1], P'[2], ... as follows:

   (1) Let P'[2i-1] be the ith of P[1], P[2], ... which is not an algorithm.
   (2) Let P'[2i] be the ith of P[1], P[2], ... which is an algorithm.

Then there is a simple algorithm to determine, given j, whether P'[j]
is an algorithm--just see whether j is even or odd.  Moreover, each of
P[1], P[2], ... is P'[j] for some j.  How do you reconcile the existence of
this one-to-one correspondence between integers and procedures with
the claims of Example 0.19.

[Example 0.19 is the same proof of the halting problem given in
note .88.  The reader of this reply should note that Aho and Ullman use
the terminology "procedure" for a Turing Machine which may or may
not halt, and "algorithm" for a Turing Machine (or procedure) which
halts.]

The Aho-approved answer to the exercise is that it does indeed
produce a notational system in which it is possible to determine if
a Turing Machine halts.  However, it is possible to prove that there
is no computable mapping from this alternate notation system to the
standard notational system that we deal with every day.  In other words,
a Turing Machine cannot compute from the P' notation a formulation
of a program that we can deal with.
87.105Programming Languages are Turing MachinesLATOUR::RMEYERSRandy MeyersMon Jun 16 1986 20:3843
Re .95:

> Programming languages are not TM's, programs written in them are.

Yes and no.  Some of the programs written in a programming language
are Turing Machines (for example, a Turing Machine simulator).  In
general, however, any single program written in a programming language
is not a Turing Machine (for example, a program to compute a square
root).  It is true that the set of all programs in a programming
language defines a Turing Machine.  It is also true that the set of
all programs in a programming language defines the programming
language (this is the basic definition of a language given in
most formal language books).  The programs are the language.

Someone may object that there is a great deal of algorithmic fat in
programming languages, that a programming language contains many
programs unnecessary for the programming language to be a Turing
Machine.  That is true.  That does not detract from the programming
language being a Turing Machine.

Care should be taken in specifying "minimum Turing Machines."  If you
only keep those features necessary for a programming language to be
a Turing Machine, you are left with Turing's bare definition
of a Turing Machine.  From the algorithmic point of view, all
programming languages that are equivalent to a Turing Machine are
equivalent.  This is the Turing Tar Pit.  Thus, APL is equivalent
to ALGOL. (Sorry Mr. Blickstein, but you can console yourself with
the fact that style is everything!)

Even with all of that aside, the distinction that you make, that the
interpreter is different from that which is interpreted, is not made
in automata theory.  For example, the proof that a single-tape Turing
Machine is equivalent to a multi-tape Turing Machine is based on the
fact that a single-tape Turing Machine can simulate a multi-tape
machine.  (A multi-tape Turing Machine is an example of a Turing Machine
with "algorithmic fat," by the way.)

This brings up an interesting point.  The languages groups are constantly
producing programs that are equivalent to Turing Machines.  The VMS group
(except for DCL?) isn't working on a Turing Machine.  Since everyone
agrees that Turing Machines are better than more limited automata,
shouldn't the programming languages groups be much better funded than
the operating system group?  Maybe Keating should bring this point up.
87.106More garbageCADSYS::COOKNeilTue Jun 17 1986 03:2830
> Re 87.99 by John Haxby

CLU:
    If CLU has no global declarations, then I would guess that it
"garbage collects" variables when they go out of scope. Does its
garbage collector do more than that?

    Since it has no pointers, I suspect that programming applications
with "persistent" multi-threaded data structures would require very
different approaches to those in languages such as C, Ada and Pascal.

Algol68:
    I certainly know of Algol68, but after 5 years, I don't know much.

Garbage collection:
    In your original note I think you suggested that garbage collection
might be done by a separate process at a lower priority. Of course,
for many machine architectures this would be an impossibility due to
the logical separation of process address spaces. VMS and UNIX (I suspect)
suffer from this drawback normally(?).

    I think we both agree on the value of garbage collection. My
problem is in the lack of real, practical implementations. I suspect
Algol68 is a lost cause. I will look into CLU.

    Frankly, I am surprised that space allocation/deallocation is such
a major concern to you. Perhaps we have experience of different problem
domains.

    *GET_VM* Personally, my problems come here. *FREE_VM*
87.107don't drop litterVOGON::HAXBYJohn Haxby, IPG/Ultrix, ReadingTue Jun 17 1986 09:44108
  Re. 87.106 by Neil Cook
    
>CLU:
>    If CLU has no global declarations, then I would guess that it
>"garbage collects" variables when they go out of scope. Does its
>garbage collector do more than that?
    
    Wot scope?  The 'normal' scoping rules don't really apply.  Sure,
    you get things which are equivalent to local variables (strictly,
    mutable objects) which go away when a procedure returns.  However,
    you do get a sort of global entity which you declare inside a cluster,
    and declaring new instances of that object effectively declares
    space for that object.  Only it doesn't.  What you have to realise
    is that CLU has only two things: mutable and immutable objects.  Once
    you understand this you will understand the nature of the language
    and, honest, it's not that complex until you try to map it onto
    conventional terms like 'scope', 'global' and 'pointers'.
    
>    Since it has no pointers, I suspect that programming applications
>with "persistant" multi-threaded data structures would require very
>different approaches to those in languages such as C, Ada and Pascal.
    
    You're right, it does, which upset me for a while.  As far as I
    am concerned, only one language implements 'pointers' correctly,
    Algol68, and it calls them references, which makes much more sense.
    In CLU, forget pointers, the compiler will look after that kind
    of thing for itself.  Just write it down as you think it should
    be and it will probably work.  The only thing that doesn't is the
    'triple-ref' (as I was taught it) list walker which results in a
    statement like "tracer = &(*tracer)->next" in C (you can't do it
    in pascal either).  CLU uses other methods; for instance instead
    of testing explicitly for the end of the list you would do something
    like:
    		tracer := tracer.next except
    				when bounds tracer.next := newelem end
    
    or something like that (I still haven't got my books here) and I
    haven't got the CLU system up and running either.  No unix machine,
    life is hard.
    
    
>Garbage collection:
>    In your original note I think you sugguested that garbage collection
>might be done by a separate process at a lower priority. Of course,
>for many machine architechtures this would be an impossibility due to
>the logical separation of process address spaces. VMS and UNIX (I suspect)
>suffer from this drawback normally(?).
    
    For "separate process" read, maybe, "co-routine".  I was in fact thinking
    of kernels which support garbage collection, that is, run-time kernels
    for language systems.  For a large system based on a garbage collecting
    language, e.g. Unix written in CLU, you would have a garbage collector
    process, like the scheduler and the page daemon, which would run at
    low priority and be scheduled by the scheduler when the system had
    little else to do.  (Under the Unix scheduling system it would get
    a look in every so often anyway.  In practice it would get a look
    in if nothing else could run because there was no space.)  Under
    VMS (written in CLU) you would have a kernel context process, i.e.
    something which has the relevant privilege.
    
>	... Algol68 is a lost cause.
    
    I hate to say it, but I have to agree with you.  There weren't
    enough people who pushed it hard enough.  Its ideas were way ahead
    of anything else in '68, or even '76 (when the revised report came
    out), but alas it has been dated by more powerful programming concepts,
    although many languages can benefit from many of its constructs.
    C has steadily been moving towards Algol68 since its inception,
    and C++ contains many more features and a re-hash (called classes)
    of something that existed in 68 because it was so general.
    
>    Frankly, I am suprised that space allocation/deallocation is such
>a major concern to you. Perhaps we have experience of different problem
>domains.
    Many of the things I do don't require storage handling at all, but
    when I do need it I tend to use lots of it and always in a different
    way.  With VM I am not so bothered about losing space in, say,
    compilers,  because they don't run for so long.  With a system that
    is going to run for a long time (eg an editor) I spend a lot of
    time on the storage management and usually end up with some compromise
    over what I would really like.  By the way, if you come across
    an up-to-date CLU system on the net somewhere, I would like to get
    hold of it...
    
    
    Re 102, Algol68 definition.
    The (Revised) Report does indeed define as syntax what many programmers
    call 'semantics'.  In fact it defines as syntax all the things which
    are usually called 'static semantics', things like making sure
    a variable is declared somewhere and that it is in scope at this
    point.  Also, it makes sure that mode declarations (typedefs to
    C plebs) are legal, there is a set of rules somewhere (7.1.4? I
    forget) which use Yin and Yang to check that recursively defined
    modes can occupy finite storage.  There are also rules that make
    sure that unions of types, no matter how complex, do not yield modes
    whose instantiation can be made ambiguous.
    
    There is a book called "introduction to formal language definition"
    (or something) and I'm afraid I can't remember the author at the
    moment, but in it he defines a language called "eva" in several
    different ways.  Two of the definitions are as two-level grammars:
    one defines just the static semantics of the language, the other
    (which is about five times longer) defines the dynamic semantics
    as well -- including things like rules of arithmetic.  Honest. 
    If anyone is interested I will dig out the book and mail them copies
    (hard only, I don't fancy typing it in).
    
    								jch
87.108CIM::JONANYou can't get there from hereTue Jun 17 1986 14:0854
  Re: .104

  Yes, there are many non-Turing computable functions, including those that
  answer whether or not any given Turing machine halts on a given input.
  In fact there are non-denumerably many!!  This fact is a quick corollary
  to the propositions 1) There are only as many TM's as there are counting
  numbers (ie, they're denumerable) and 2) the set of functions
  f:N -> N, where N is the set of counting numbers, is the same size as
  the set of all subsets of N (= P(N) = power set of N), which is equinumerous
  with the set of real numbers.  As Cantor showed (in the FIRST use of what
  came to be known as "diagonal arguments") the set of reals, R, is strictly
  "larger" in size than N (in fact, there are as many more reals in R than
  counting numbers in N as there are reals altogether!!).
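
  Cantor's diagonal argument alluded to here fits in two lines (stated
  for functions rather than reals):

```latex
\text{Suppose } f_1, f_2, f_3, \dots \text{ enumerates all of } \{\, f : \mathbb{N} \to \mathbb{N} \,\}.
\text{Let } g(n) = f_n(n) + 1. \text{ Then } g(n) \neq f_n(n) \text{ for every } n,
\text{so } g : \mathbb{N} \to \mathbb{N} \text{ is missing from the enumeration -- contradiction.}
```

  The same construction, read off against an enumeration of TM's, is
  exactly the "contradiction" program of the halting proof.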

  Re: .105

  Hmmmmm.  Turing machines are algorithms which can be described by many
  different methods: state-action tables, sets of quadruples (usual first
  step in defining an enumeration based on strings of symbols), state
  transition diagrams, and others (including programs in your favorite
  programming language).  So, any given program will define some TM and
  vice-versa.  Perhaps you are thinking of the universal machine concept.
  The universal machine is nothing more than another TM which is capable
  of carrying out the actions of any TM given any input string (a kind of
  TM interpreter).  It is very useful in showing the undecidability of
  FOPC.  The definition that a formal language IS a set of expressions
  (programs if you will) doesn't seem to help much.  I agree that this is
  certainly one definition but, except for small cases, it is very hard to
  work with.  To be useful, an extensional definition is needed, wherein
  you define a language by specifying  sets of symbols (that will be used
  to refer to objects, relations, operations, etc.) and some set of rules by
  which the symbols may be combined to create well formed expressions
  (say programs).  Such a definition will clearly define a unique set
  of expressions (programs) and certainly any given set can be given
  such a definition.
    Now, we can interpret these languages in most any way we choose (ie. place
  a semantics on them) so that the expressions actually say something about
  the objects we are concerned with (the "domain of discourse").  One such
  interpretation will let us state that the expressions define algorithms,
  i.e., TM's (Yes, there is some hand-waving here, but I'm not giving a
  proof and the details are too complicated.  Scope out a book on model
  theory if you're really interested).  At this point we can legitimately
  call the expressions programs.  
    The point:  there are infinitely many of these language definitions
  whose expression sets (the actual languages in your definition), under
  the appropriate interpretation, define the same set: the set of TM's.
  Certainly, there are among these, those whose extensional definitions
  are simpler, more elegant, clearer - in a word better - than the rest.
  This is all a bit abstract, but I think it points out that here
  style IS everything!  (I don't think 'style' is the right word here,
  but it was a good tie in...)

  /Jon
87.109CLT::GILBERTJuggler of NoterdomTue Jun 17 1986 16:583
Alright, already!  I'm convinced!  You've convinced me!
VAX Notes really should have some way to mark all future replies to
a topic as 'seen'.
87.110now, for something completely differentDRFIX::RAUHALAKenTue Jun 17 1986 22:493
Yes, but you never know what a topic is going to evolve into, (at least
one with lots of replies like this one) and you might miss some useful
info in the future, like, Turing Machines.
87.111Different, yes. Useful?MLOKAI::MACKa(2bThu Jun 19 1986 16:238
Re. .110:
    
    I fail to see what Turing Machines, ALGOL-68, or proving program
    correctness have to do with "useful info"...  
    
    					Pragmatically,

    					   -Ralph
87.112RelativityVOGON::HAXBYJohn Haxby, IPG/Ultrix, ReadingFri Jun 20 1986 07:1316
    Re .-1.
    
    I thought I had written .110, obviously I hadn't.  To those people
    who care about the nature of a language Turing Machines, provability
    (and Algol68?) matter.  If you don't keep language theory firmly
    in mind you wind up with something like the BASIC clones or PL/1
    or CWL3 (I'd tell you about CWL3, it'd make you laugh, or cry, but
    it's proprietary.  I'll reveal the juicy bits over a drink though...)
    
    "Useful" languages, like CLU or '68 have a strong theoretical
    background and they behave themselves.  Languages without a strong
    theoretical background don't -- things like C, which just growed.
    Not all carefully developed languages are any good ... take Ada
    for example ...
    
    						jch
87.113I confessCSTVAX::MCLUREVaxnote your way to ubiquityMon Jun 23 1986 14:1815
	Ok, the Tangential Express seems to have returned to earth for
    refueling, so what's the scoop?  Is there any consensus yet on what
    sort of data abstraction we can use to base this discussion on, or
    do we pack up our keyboards and go home?  I (somewhat sheepishly)
    admit to bringing up the Turing Machine and Church's Thesis in an
    attempt to find a common level of data abstraction that we can use
    here to form a base layer of what could evolve into a "Universal
    Language" definition.  I had no idea it would stir up such a response
    (I wish I could stir up responses like these more often).

	If anyone is interested in working towards the above goal and
    thinks that maybe this isn't the appropriate note for it, let me
    know.  Otherwise, who has some suggestions on how to proceed?

						-DAV0