T.R | Title | User | Personal Name | Date | Lines |
---|---|---|---|---|---|
190.1 | C not expression language | COMICS::DEMORGAN | Richard De Morgan, UK CSC/CS | Fri Jun 24 1988 10:06 | 7 |
| There seems to be some confusion here. Are you saying that C is an
expression language? (line 6 of .0) If so, you are wrong - at least in
the sense that Algol 68 and Bliss are expression languages. Some
languages take things even further: S3 for example, can deliver
a value from a CASE clause - useful for table searching. C does not
even provide a conditional expression (a la Algol 60)! - correct me if
I am wrong, but I cannot see this anywhere in the C syntax.
|
190.2 | Definitions, mlisp; keywords in C - hah! | DENTON::AMARTIN | Alan H. Martin | Fri Jun 24 1988 11:55 | 14 |
| Re .0:
You should define what you mean by "procedural language" and "functional
language" before attempting to draw conclusions about them.
You should also find out about meta LISP (mlisp), which was a preprocessor
which translated an Algol-style syntax into LISP. Can anyone provide any
references for it?
Re .1:
Your vision has been clouded by the expectation of an obvious syntax for
conditional expressions in C.
/AHM
|
190.3 | mlisp = M-expressions? | SAUTER::SAUTER | John Sauter | Fri Jun 24 1988 13:41 | 10 |
| re: .2
I believe mlisp was the "publishing notation" for Lisp. Mlisp was
written in M-expressions, whereas Lisp was written in S-expressions.
Translating M-expressions to S-expressions is pretty straightforward.
I haven't heard anything about M-expressions in a long time; I assumed
the concept of using a different language for publication than for
coding had faded. Maybe not.
John Sauter
|
190.4 | Not just for publishing | DENTON::AMARTIN | Alan H. Martin | Fri Jun 24 1988 15:48 | 6 |
| Re .3:
I've seen documentation for a translator for it (probably for MacLISP) so it is
more than an abstract notation - people could execute real programs written in
it.
/AHM
|
190.5 | | DSSDEV::JACK | Marty Jack | Fri Jun 24 1988 16:54 | 3 |
| C provides a very limited conditional expression:
a ? b : c
returns b if a is nonzero, c if a is zero.
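A tiny sketch of it in use (the variable names are just illustrative):

    #include <stdio.h>

    int main(void)
    {
        int a = 0, b = 10, c = 20;
        int larger;

        /* Yields b when a is nonzero, c when a is zero -- prints 20 here. */
        printf("%d\n", a ? b : c);

        /* Being an expression, it can appear anywhere a value is wanted. */
        larger = (b > c) ? b : c;
        printf("%d\n", larger);

        return 0;
    }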
|
190.6 | | TOKLAS::FELDMAN | PDS, our next success | Fri Jun 24 1988 19:22 | 8 |
| I've even used meta-lisp, a long, long time ago (early 70's). I
suspect it never caught on simply because it never achieved a critical
mass of users. Lisp is made livable (and lovable) because of the
wonderfully powerful environments that have grown around it. Without
people working on similar environments for meta-lisp, it couldn't
compete.
Gary
|
190.7 | just a beginner in this part of the field | MJG::GRIER | In search of a real name... | Sun Jun 26 1988 13:20 | 46 |
|
Re: .all:
I guess my point is that for some languages, there isn't much
difference. I would consider C an "expression" language, because of
its definition of most things being expressions. Granted, it DOES have
statements, "if", "switch", etc., but if I were to lump it into one or
the other, I'd call it an expression language because you can do things
like..
a += (b = (getchar() == '\n'));
However, C is not what we'll call a "traditional functional
language" like LISP is. My point (which I realized somewhere while I
was writing .0) is that there's a lot of overlap. LISP, while claiming
to be "functional", also provides procedural constructs, and languages
like Pascal, while purporting to be procedural or statement languages,
can provide functional constructs.
Re: definitions (PSW, I believe):
I'd call a functional language one in which the execution of
meta-instructions is performed through the evaluation of an expression,
while a procedural or statement language is one where the execution of
meta-instructions is performed in a sequential manner as described by
the program.
These definitions do not prevent statement-extensions to
functional languages (the prog-operator in LISP), and do not prevent
statement languages which can be functional by making each defined
function be a sequence of a single operation.
Basically, I'm just learning about this subject, and I wanted some
people's thoughts and opinions. I was hoping to provide some fresh
insight from a person who hasn't been trained in the current technology
to the point that my thoughts follow set guidelines. However, with my
lack of experience in language design, most of my thoughts are a little
too wild and won't work, so I was looking for some ideas from people
who are "in the know."
Thanks for the input so far!
-mjg
|
190.8 | Here, have some definitions :-) | TLE::JONAN | Be a Realist - Demand the Impossible | Sun Jun 26 1988 14:50 | 69 |
|
Re: M-expressions
Once upon a time, LISP was to have a "typical" FORTRAN like syntax
(this was the original plan of McCarthy and Co.). This syntax became
known as M-expressions (meta) while the basic constructs (primitives)
were known as S-expressions (symbolic). What happened was that the
universal evaluator (the EVAL function) was most easily and cleanly
described in terms of S-expressions, of which one obvious fallout
was the equating of data and functions (expressions, programs, etc.).
The EVAL function was then quickly "hand" implemented by one of
McCarthy's coworkers in order to have a working (if preliminary)
version of the language.
Once this "cat was out of the bag" there was no turning back. The
ability to create and evaluate functions, etc. dynamically during
program execution became one of the more important aspects of LISP
- and the M-syntax faded into one of those "historic curiosities".
Re: .7
Definitions: Here's my cut...
Expression Language: A language which allows (though perhaps does
not require) any action (i.e., non-declarative) construct to yield
a value.
Statement Language: A language which attempts to draw a distinction
between two types of computational actions: those which (should)
produce values (i.e., expressions) and those for which the action
itself is the desired result (statements).
Applicative Language: A language whose sole action construct is
the application of *functions* to operands (most importantly the
results of other such applications) to produce values/yield results.
(Yes, an applicative language is an expression language but the
converse is not true)
Imperative Language: A language by which computations are produced
by "tracking" the current state (as determined by a set(s) of variables)
and allowing for the *explicit* conditional changing of this state
(set(s) of state variables). [Imperative as in imperative
sentences...]
Procedural Language: A language in which the (primary) mode of expression
is the stating of the step-by-step sequence of actions to be performed
to achieve some result.
Declarative Language: A language in which the (primary) mode of
expression is the stating of a set of facts/definitions/rules and
the required results/actions/values. [Basically what you know and
what you want]
I wouldn't say these are definitive [;^)] and the last seems the
least satisfactory (at least to me); but I believe they are pretty
much on target. Moreover, these certainly don't cover the gamut of
what's out there (OO, equational, logic, etc.)
Examples: LISP pretty much falls into the slot of a procedural
applicative language, APL a declarative applicative language,
BLISS a procedural imperative expression language. Of course,
there are always "features" which can throw monkey wrenches into
this...
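To make the expression/statement distinction concrete, here is a small
C sketch (C being nominally a statement language that still lets most
constructs yield values; the names are just illustrative):

    #include <stdio.h>

    int main(void)
    {
        int x = 7, sign;

        /* Statement style: the if/else yields no value of its own. */
        if (x >= 0)
            sign = 1;
        else
            sign = -1;
        printf("%d\n", sign);

        /* Expression style: ?: yields a value, and the assignment is
           itself an expression whose value could be used further.    */
        sign = (x >= 0) ? 1 : -1;
        printf("%d\n", sign);

        return 0;
    }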
/Jon
|
190.9 | Good answer, good answer! | MJG::GRIER | In search of a real name... | Sun Jun 26 1988 20:24 | 7 |
| Re: .8:
Thanks for the definitions! The "Dragon" book doesn't talk about
this stuff, that's for sure!
-mjg
|
190.10 | Is Designing a Compiler designing a Language? | ATLAST::WELTON | I broke up Springstein's Marriage | Mon Jun 27 1988 10:15 | 12 |
|
I have used both the "Dragon" books over the last few years. For
me they were an excellent source of techniques for designing a
compiler, but I wouldn't use them as a base for designing a language.
My advice: Check into the language "principles" books by Pratt and
the recent publication by MacLennan (the Beatles' computer scientist?).
Both of these authors do a lot of comparative analysis of languages.
They also abstract out the features that are found in the major
"types" (expression/applicative/imperative...).
douglas
|
190.11 | Two fields of expertise | DENTON::AMARTIN | Alan H. Martin | Mon Jun 27 1988 16:10 | 6 |
| Re .10:
> -< Is Designing a Compiler designing a Language? >-
No, and far too many people in both worlds don't know the difference.
/AHM
|
190.12 | How true, I see! | MJG::GRIER | In search of a real name... | Mon Jun 27 1988 18:40 | 15 |
| re: .11:
>No, and far too many people in both worlds don't know the difference.
And I'm really starting to learn the differences! I think we need a
few more people spending time properly designing languages so that
the job of the compiler writer is more to implement a specification
(like in Ada) than to also design the language (case in point: C).
Thanks for the information everyone! You've been helpful and
nobody's really hacked/slashed me for having misconceptions and some
weird ideas. Learning more all the time...
-mjg
|
190.13 | | TLE::BRETT | | Tue Jun 28 1988 13:01 | 5 |
|
I think we need a lot fewer people designing languages, and a lot
more working out how to use the (far too) many that we already have.
/Bevin
|
190.14 | | TLE::JONAN | Be a Realist - Demand the Impossible | Tue Jun 28 1988 19:59 | 28 |
| Re: compiler design /= language design
Also, as Hoare is fond of pointing out, language design is not the
same as *feature* invention (creation, design...). He sees the
task of the language designer as being that of creative and consistent
consolidation of already known features. Even though I think this is
somewhat radical, it's a useful distinction and worth keeping in mind.
Wirth advocates that language design and compiler construction (design?)
for the language, though separate things, should be done in parallel.
The idea is that some of what he (and others) take as important
aspects of languages have to do with things like: simplicity of syntax,
number of constructs, efficiency of resulting code and compilation, etc.,
and that these things are much more readily apparent at implementation
time. He even goes so far as to say that machine architecture (design)
should go hand-in-glove with this (ex: Modula-2 and the Lilith). Again,
though something to be taken with a grain of salt, good things to keep
in mind.
As for pleas for less language design and more ideas on use (software
engineering??), I guess I disagree with this. Indeed, many (if not
most) good ideas about how to design software have come about from
language (or at least feature...) design efforts.
/Jon
|
190.15 | but | MJG::GRIER | In search of a real name... | Tue Jun 28 1988 21:50 | 74 |
|
Re: less language design
I half agree, half disagree with you on this point.
I agree because with all the little research projects going on, there
are dozens (maybe hundreds!) of languages which are constantly being
developed, implemented and used. If more coherence and unity of drive
could be applied to all the developers and designers, much wasted,
duplicated effort could be avoided.
I disagree very strongly, because I think that computer languages
have quite a way to evolve before anyone could feel comfortable saying,
"Now THIS is how you do it" in a voice which will last any length of
time.
After all, what's a computer language all about? Syntax? A nifty
efficient parser? Standardization? Partly all, but also partly none.
I see a computer language as being a means to direct a computer to
utilize an abstraction the "programmer" has conceptualized.
My latest personal thoughts have been moving away from ANY
"compilable" syntax. Who really cares if you write...
A[I] := 53.2;
or
a[i] = 53.2;
or
100 LET A(I) = 53.2
It's all taking a human concept of setting the Ith element of an
array of somethings to the value 53.2. (Of course the C example has
slightly different semantics, since a[i]=53.2; is completely equivalent
to *(a+i)=53.2;, and some would argue about which is INTENDED...)
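A quick C sketch of that equivalence (names purely illustrative):

    #include <stdio.h>

    int main(void)
    {
        double a[10];
        int i = 3;

        a[i] = 53.2;        /* subscript form                      */
        *(a + i) = 53.2;    /* exactly the same operation in C     */
        i[a] = 53.2;        /* legal too, since a[i] means *(a+i)  */

        printf("%f\n", a[i]);
        return 0;
    }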
I think the issues at hand are...
- Proper abstraction of a problem (OO paradigm...)
- Reusability (OO systems, Ada, MODULA-2 to a degree)
- Resultant code quality (BLISS, WRL's MODULA-2, and most
definitely NOT C)
- Environment conducive to high quality software (Ada, Pascal, once
again, NOT C)
Ada covers the 2nd and 4th aspects quite well, Pascal the 4th to a
degree, BLISS the last 3, and C, well, I don't see C code covering any
of those desirables.
BUT unfortunately, we're for the most part not ready for a
non-syntax oriented language. 4GLs while having a growing market are
still not the mainstay of the application developer. I've heard of
"flowchart languages" which give a strictly graphical view of the
program executing (COBOL Generator, for example,) but from what I've
heard and seen they seem pretty limited so far. So, what does that
leave us with? Standard edit-your-source-file-and-compile languages,
with more and more nifty integrated edit/compile environments (LSE, the
CASE program in general.) My giving in to this reality was one of the
things which spurred me to ask the original topic.
However, after reading this topic for a while, my attitude's changed
again. What's needed is a good, universal, high-quality, non-syntax
oriented way of specifying a program's execution. A way which allows
for all 4 of the requirements I listed above.
Ok, so I'm an idealist! :-)
-mjg (waiting for the DECwindows
NetEd widget...)
|
190.16 | Paradigms, Paradigms everywhere and not a drop to... | ATLAST::WELTON | Nancy Reagan told me to say it | Wed Jun 29 1988 15:36 | 72 |
|
Language design is like artwork: Should we stop new artists from
painting just because we haven't savored every picture? (Think about
it this way: What if language design had been suspended after the
development of COBOL... Need I say more? (VMS written in COBOL)).
Re: .15
> I think the issues at hand are...
> - Proper abstraction of a problem (OO paradigm...)
> - Reusability (OO systems, Ada, MODULA-2 to a degree)
> - Resultant code quality (BLISS, WRL's MODULA-2, and most
> definitely NOT C)
> - Environment conducive to high quality software (Ada,
> Pascal, once again, NOT C)
> Ada covers the 2nd and 4th aspects quite well, Pascal the 4th
> to a degree, BLISS the last 3, and C, well, I don't see C code
> covering any of those desirables.
I get the feeling you are not hot on C. I am not a fan either, but I
disagree with your "issues". None of them are language dependent.
They all are best handled in the programmer's head. LSE is a lovely
environment for development, and it handles C constructs just as easily
as Ada; however, to the programmer enslaved to the religion of EDT, the
LSE environs may be considered do-do. People jump on C and call it
terse (I used to do this all the time). But if you look at it from a
quality standpoint, what C does is force the PROGRAMMER to do the
optimizing (something they can do a lot better non-deterministically
than the heuristic hacks some "non-DEC" products provide).
> BUT unfortunately, we're for the most part not ready for a
> non-syntax oriented language. 4GLs while having a growing
> market are still not the mainstay of the application developer.
> I've heard of "flowchart languages" which give a strictly
> graphical view of the program executing (COBOL Generator, for
> example,) but from what I've heard and seen they seem pretty
> limited so far. So, what does that leave us with? Standard
> edit-your-source-file-and-compile languages, with more and more
> nifty integrated edit/compile environments (LSE, the CASE
> program in general.) My giving in to this reality was one of
> the things which spurred me to ask the original topic.
My first suggestion: Write a couple of APL programs. That will
definitely free your mind of the edit-compile-link-run-edit..
cycle.
What you express is a common idea among those in search of an
even HIGHER ( 5GL, etc... ) level language. The notion that
humans should do less in order to get the machine to do more (in
my most humble opinion, 4GL's are only a crude hack at the
notion). Perhaps you should investigate some of the
non-procedural languages (e.g., Prolog). (BTW, I'm not really sure
if there is such a thing as a non-syntax language. Syntax, by
definition, must always exist. An irregular-syntax language? What do
you think?)
The paradigm of Logic Programming is concerned with taking the
principle of Information Hiding to the max. The programmer simply
concentrates on What is to be done, and the compiler/interpreter
is designed to figure out how to do it. For example, the
programmer might want to sort a group of elements A, and place
them in another group B. It would then be the responsibility of
the compiler to decide whether to use Arrays or Trees, or whether
to use a Bubble sort or Quickersort.
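For what it's worth, even plain C carries a faint echo of that split
between the "what" and the "how" in its library sort -- a sketch, not
logic programming, with invented names:

    #include <stdio.h>
    #include <stdlib.h>

    /* The caller states only what order is wanted (the comparison);
       the library decides how the sorting is actually carried out.  */
    static int by_value(const void *p, const void *q)
    {
        int a = *(const int *)p, b = *(const int *)q;
        return (a > b) - (a < b);
    }

    int main(void)
    {
        int b[] = { 3, 1, 2 };

        qsort(b, 3, sizeof b[0], by_value);
        printf("%d %d %d\n", b[0], b[1], b[2]);    /* 1 2 3 */
        return 0;
    }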
That's my gasoline for the fire...
douglas
|
190.17 | Silly computer, syntax is for humans! | MJG::GRIER | In search of a real name... | Wed Jun 29 1988 16:51 | 55 |
|
Re: .16:
Yup, I'm NOT hot on C. After several readings through the K&R book,
nearly -losing my dinner- on the first few chapters or so, I had vowed
not to learn the language. Then I took a compiler writing course
where we had to write a C compiler, in C. (Actually we took a barely
functioning minimal shell of a compiler and extended it. I ended up
re-writing pretty much everything except for main().) I feel that
after (1) picking up someone else's C code and working with it, and (2)
implementing a compiler for it, I'm qualified to speak heartily of my
dislike of C. It's not conducive to any of the elements I mentioned
(except for possible re-usability, but then not any more than Pascal
in my opinion) and I'll stand behind that.
Re: Programmer optimization
I've been reading with great interest about the relative merits of
programmer-optimized code vs. compiler-optimized code in another topic
in this conference. The consensus is that a well-written compiler with
an extensive optimization algorithm can produce code which is more
efficient than "programmer optimized". The way I see it, the
programmer more than has his job cut out producing code which correctly
implements the algorithm and is maintainable. Worrying about
specifying something like "a[j=++i] += 5;" rather than "i := i + 1; j
:= i; a[i] := a[i] + 5;" is a waste unless you're a really bad typist.
:-)
Re: Syntax:
I generally hate it when people quote a definition, but I'm going to
just have to hate myself a bit. From my DEC standard desk
dictionary...
syntax (sin'taks') n. The way in which words are put together to form
constructions, such as phrases and sentences.
My contention is that an algorithm is a structure, if you will, which
is without syntax, only semantics. Would you consider an intermediate
form for a language which is in a tree form to have syntax? No. Only
semantic content. Syntax is just for us humans in order to communicate
the ideas to other humans. Imagine, if you will, a "universal"
intermediate representation which provides abstractions of most if not
all basic computer functions, which is directly modifiable by the user.
Obviously such a language is without syntax, although a means of
traversal and manipulation is needed (my comment about the NetEd
widget). If you really wanted, the tree could be processed into a more
primitive language, but in general, why bother!
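A rough C sketch of what such a tree form might look like (all names
here are hypothetical, just to make the idea concrete):

    #include <stdio.h>

    /* The program exists as linked nodes, not as text: there is no
       concrete syntax to parse, only structure to walk.             */
    enum op { OP_CONST, OP_VAR, OP_INDEX, OP_ASSIGN };

    struct node {
        enum op      kind;
        double       value;     /* OP_CONST          */
        const char  *name;      /* OP_VAR            */
        struct node *left;      /* operands, if any  */
        struct node *right;
    };

    int main(void)
    {
        /* "set the Ith element of A to 53.2", held purely as structure */
        struct node a      = { OP_VAR,    0.0,  "A",  NULL,  NULL };
        struct node i      = { OP_VAR,    0.0,  "I",  NULL,  NULL };
        struct node elem   = { OP_INDEX,  0.0,  NULL, &a,    &i   };
        struct node val    = { OP_CONST,  53.2, NULL, NULL,  NULL };
        struct node assign = { OP_ASSIGN, 0.0,  NULL, &elem, &val };

        /* Any surface spelling (Pascal, C, BASIC) could be generated
           from -- or never exist for -- this one structure.          */
        printf("%s[%s] gets %g\n",
               assign.left->left->name,
               assign.left->right->name,
               assign.right->value);
        return 0;
    }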
Re: last part (Logic Programming). Sounds just like what I'm trying to
think about. Any good sources of information on this?
-mjg
|
190.18 | Now my head hurts... | ATLAST::WELTON | Nancy Reagan told me to say it | Wed Jun 29 1988 18:45 | 19 |
| Logic programming sources:
"Proceedings of a Symposium on Very High Level Languages",
(SIGPLAN Notices 9, April 4 1974).
John Backus, "Can programming be Liberated the von Neuman Style?
A Functional Sytle and its Algebra of Programs", Communications
of the ACM, vol 21, no 8, august 1978, page 613-641.
Kowalski's "Algorithm = Logic + Control", Communications of the
ACM, vol 22., no. 7, July 1979, page 424-436
Clocksin and Mellish, Programming in Prolog, Second Edition published
by Springer-Verlag, 1984
You might also want to check in the Prolog notes conference for
more recent pointers and papers.
douglas
|
190.19 | Functions as objects in functional languages | TOKLAS::FELDMAN | PDS, our next success | Thu Jun 30 1988 12:01 | 28 |
| One of the properties of functional languages that hasn't been
mentioned is that functions are frequently treated as objects,
preferably as first-class objects.
This shows up clearly in the LISP mindset, by the prominence of
the MAP functions. Some procedural languages can achieve the same
effect, but not nearly as cleanly as LISP does.
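For comparison, here is roughly what the same idea costs in a
procedural language -- a hand-rolled C map using a function pointer,
with all names invented for the example:

    #include <stdio.h>
    #include <math.h>

    /* Apply f to each element of in[], writing the results to out[]. */
    static void map_double(double (*f)(double), const double *in,
                           double *out, int n)
    {
        int i;
        for (i = 0; i < n; i++)
            out[i] = f(in[i]);
    }

    int main(void)
    {
        double in[] = { 1.0, 4.0, 9.0 };
        double out[3];
        int i;

        map_double(sqrt, in, out, 3);    /* cf. LISP's MAP with #'SQRT */
        for (i = 0; i < 3; i++)
            printf("%g\n", out[i]);      /* 1 2 3 */
        return 0;
    }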
More significantly, the ability to manipulate functions as objects
makes it possible to write a truly integrated editor or debugger
within the LISP environment.
APL has this property, but with a different flavor. The / and \
binary forms, for example, each take a binary operator and an APL
object as operands and produce a new value. Another way to look
at this is that / and \ take a binary operator and produce a unary
operator. This isn't as flexible as it could be, since the operators
are limited to built-in operators. The key is that part of programming
in APL is to build new functions by combining old functions.
Backus's FFP also has this property, though I consider it awkward.
Implementations of these languages usually involve interpreters,
which makes it easier to deal with these particular features. However,
it really is a matter of both the implementation and the language
semantics to make these features fly.
Gary
|
190.20 | Trees form languages and languages have syntax | DENTON::AMARTIN | Alan H. Martin | Fri Jul 01 1988 00:00 | 27 |
| Re .16:
>Need I say more (VMS written in COBOL)).
You mean it's not?
>But if you look at it from a
>quality standpoint, what C does is force the PROGRAMMER to do the
>optimizing (something they can do a lot better non-deterministically
>than the heuristic hacks some "non-DEC" products provide).
"non-deterministicly" is a good term for it. How many times have I heard "but
it worked on the other {hardware|operating system|version of the compiler|...}.
Re .17:
>Would you consider an intermediate form for a language which is in a tree form
>to have syntax? No.
Graham-Glanville code generators (among others) parse an intermediate language
in order to generate object code. The one I worked on used a tree-based
intermediate language. How did it do that if the tree had no syntax?
Languages are all around us. The Dragon Books cite J. F. Jarvis' paper which
describes using regular expressions to recognize imperfections in printed
circuit leads.
/AHM
|
190.21 | Ghosts in the data path... | PULMAN::MACK | Embrace No Contradictions | Mon Aug 01 1988 19:56 | 48 |
| I think I've identified the characteristic of applicative languages that
folks like myself brought up on FORGOL procedural languages find a real
stumbling block.
In FORGOL, every data object of any significance has a name, and we can
follow the lives of these data objects by observing their changes of
state. Things without names were fleeting, easily lost, and not to be
counted upon.
In LISP or APL, objects tend to be ephemeral, a nameless wisp of
bits somewhere on the stack. In college I got the impression that
novice LISP programmers use a lot of SETQ's but that the real elite
left such childish things behind and scorned those who still held
to the ancient religion that objects could be controlled only by
knowing their true name. :-)
I would consider this characteristic, a "major selling feature"
to some as evidence of the elegance of the language, to be in
fact a weakness of the language. A computer language should be
a "lingua franca" between the human programmer and the silicon
computer. A program is equally a pattern for the computer to follow
and a document of that pattern to other programmers.
In an applicative program, beyond a very low level involving some
literals, the only identifying elements of a program are the function
names themselves. This means that function names and comments must
carry the entire detail documentation load of an entire software
system. I think this may have a great deal to do with the reputations
of LISP and (especially) APL as being cryptic.
Someone who has written a large software system in LISP might give
some useful input here as to the adequacy of function names alone
to document what they are doing. I suspect that it must be difficult
and require a *lot* of deliberate effort.
Personally, I like using large and perhaps inelegant languages like
Ada with lots of language constructs and opportunity for naming
things to make code that reads well without comments. (No I don't
like COBOL! COBOL makes it easy to tell what a statement does but
difficult to figure out what they do collectively. Ada gives both.)
Before I used Ada, I used PL/1.
My motto: Computers should work very hard so people don't have to.
(Now that I'm on a VAXstation, other users don't complain as much. :-))
Ralph
|
190.22 | | AITG::VANROGGEN | | Wed Aug 03 1988 01:16 | 101 |
| re: .21
A very interesting point--regarding objects and names.
Oddly enough, LISP is considered a "symbolic" language. This is
because of the ability to manipulate objects and give names to them.
Partly this ability stems from the fundamental notion that names
(known as SYMBOLs in LISP) are real objects in their own right.
They have characteristics and properties that one can easily
manipulate and extend.
As you point out, programmers used to "FORGOL" type languages will
frequently start out programming in the "bad" style which gives
names to everything. [BTW, this is a characteristic of LISP which
the other "AI languages" don't have--that one can program in one
or more of many different styles, and can mix them easily. Of
course having too many choices may be confusing for the novice.]
However, as one becomes accustomed to LISP's power and flexibility,
one tends to start thinking in terms of mapping the parts of the
problem to be solved to appropriate data structures more directly.
Most problems simply can't be solved with a predetermined set of
static data structures. So one can't have a meaningful name for
each of the objects in the system. Relational data bases don't
give names to each of the tuples, for example.
Furthermore, LISP provides the programmer with many built-in functions
that operate on many different data types (and many are generic
functions, too). Many of these functions handle functions as
data, such as the MAP function, which applies its functional argument
to each of the elements of the sequence(s) given as additional
arguments, and which optionally collects the results together into
a sequence.
Getting back to the immediate discussion, this treatment of functions
as real objects allows for several "naming" advantages:
(1) It's hard to think up names for each "temporary" or "internal"
function we might want to use.
(MAP 'NIL #'(LAMBDA (X) (PRINT (SQRT X))) SOMESEQUENCE)
concisely expresses the operation of printing the square roots of
the numbers in some sequence. Would you always want to be forced
to think up good names for those unimportant helper functions?
(2) It's good to avoid unnecessary names, to avoid the clutter of
too many names and to avoid overly large code. The larger the code
is, the harder it is to understand. The more you can fit on
one page (even with workstations to provide bigger pages), the
more you can understand at one time. And the higher level the
constructs you use along the way, the more abstract the code is,
and therefore the closer to the way you think about the problem
the code is. Which makes for easier programming and debugging.
(3) LISP also allows the easy and very powerful extension of the
language syntax with the use of macros and read-macros.
- Macros allow for easy extensions of the languages at the level
of the normal program structure, composed of lists.
(DEFMACRO CRITICAL-SECTION (&BODY body)
`(UNWIND-PROTECT
(PROGN (START-CRITICAL-SECTION)
. ,body)
(END-CRITICAL-SECTION)))
This defines a new, user-defined piece of language syntax which
allows one to wrap initialization and termination code around
any code. For example:
(LET ((GADGET (CRITICAL-SECTION (FETCH-A-GADGET)
(INCREMENT-GADGET-COUNT))))
...use the gadget without worrying that the fetch from a global
queue won't get "interrupted" by another task before it
increments some global counter...)
The details of this example are unimportant--the point is that
since LISP's syntax can be extended, it's possible to avoid
having a lot of more cumbersome mechanisms to get the same effect.
(The reason it's easy in LISP is because LISP code can be
manipulated as data, but that's another topic.)
- Reader macros change the syntax of individual characters, so
more concise printed representations of the code or data can
be used. For example, if you were working on some task that
required the frequent use of a call to some function, you
could define a read-macro character that would expand into
a call to that function with arguments that you could calculate
at the expansion time. In other languages you'd either have to
do a lot of typing, or you'd have to name the function something
very short and thus unmnemonic.
Also I'd like to point out that the kinds of development tools need
to be more general and more flexible than in a system that is static.
If one had defined an anonymous function in one module and had stored
the function in some data structure at run-time, it's useful to
be able to find the definition (source code) when examining the
contents of the structure in the debugger.
---Walter
|
190.23 | | AITG::VANROGGEN | | Wed Aug 03 1988 01:28 | 23 |
| Oh, and I forgot to mention the most obvious point--in Common Lisp,
one can attach documentation strings to names of functions and
variables. (The normal syntax makes this very natural.)
Then at any time one can ask for that documentation. The function's
name is DOCUMENTATION, btw.
There are a couple of other differences which make names in LISP
a little easier than in other languages:
(1) There are different namespaces (called PACKAGEs), to help
avoid conflicts such as UIS:WINDOW and X:WINDOW. Basically one
can substitute the VMS "$" convention with ":" to get an equivalent
namespace separation, but PACKAGEs are real objects in Common Lisp,
and can be manipulated/queried/reasoned about at any time.
(2) There are many fewer restrictions on legal characters acceptable
for names. (And one can easily avoid them, if desired.)
234T
COUNTER-INTUITIVE
$$$
are all acceptable SYMBOLs in the normal syntax (no read-macros
defined for "-" or "$").
|
190.24 | More on LISP, ending a bit off the subject... | PULMAN::MACK | Embrace No Contradictions | Tue Aug 09 1988 18:07 | 60 |
| I certainly would be the last to put down a language for having
powerful concepts like procedural objects and operations on procedures.
And Common Lisp does have a number of very attractive features I would love
to take advantage of. You have outlined several of them.
> ...as one becomes accustomed to LISP's power and flexibility,
> one tends to start thinking in terms of mapping the parts of the
> problem to be solved to appropriate data structures more directly.
This is what I find to be most dangerous -- non-verbal non-pictorial
thinking is non-transferable thinking. For code to be maintainable,
the code itself must communicate not only between the original
developer and the machine, but between the original developer and his
successors. Comments are seldom maintainable (particularly if they
aren't always in view when the code is in view -- out of sight, out of
mind).
I am willing to be convinced, but right now I am not convinced that
good function names are sufficient to completely document what is going
on in a function. I am convinced that good function names combined
with good named objects are adequate.
> Most problems simply can't be solved with a predetermined set of
> static data structures. So one can't have a meaningful name for
> each of the objects in the system.
True. Most problems generally *can* be solved with a predetermined set
of classes of dynamic data objects whose generic relationships are
static within a product version. And classes and objects almost always
have meaningful names in the real world. These should show up in the
code repeatedly like a litany to remind the developer/maintainer what
he is attempting to do.
------------
Another problem I have with languages like LISP is that I lose my way
in any nesting of a single kind of lexical relationship more than three
or four levels deep. In LISP, there is only one kind of lexical
relationship, so most functions tend toward at least four to six
levels. I find this bewildering.
In most other languages, different kinds of canonical structures are
lexically differentiated. Also, there is usually a clear lexical
distinction between what is supplied by the language and what is a part
of the application being written. These make it much easier to
mentally/visually "filter out" matters of interest from dull mechanism.
One passing thought to finish this off:
Software engineering is only necessary because all humans have simple
minds. We're real-time device interrupt engines with short stacks,
designed to chase prey to exhaustion, kill it, and eat. We are
incapable of holding at once in our minds anything complex. We are
easily distracted and forgetful. This is our evolutionary legacy. Good
tools should be directed toward making use of our creativity,
endurance, and intelligence, while discouraging us from depending on
those things for which we're unsuited. They are our means of parenting
ourselves.
Ralph
|
190.25 | escaping the dialectic | RAINBO::PRAETORIUS | R4T P8S | Thu Jul 20 1989 14:07 | 8 |
| re: creating languages vs. creating compilers
Wrong argument. The real question is, why aren't more people involved in
or excited by creating usable, enjoyable development environments (like the
kind you might find on a Symbolics workstation)?
randomly,
RP
|