T.R | Title | User | Personal Name | Date | Lines |
----------------------------------------------------------------------
126.1 | | QUARK::LIONEL | Three rights make a left | Thu Jan 22 1987 15:08 | 20 |
| Nobody I know of even tries to pretend that the Ada validation suite
ensures a "bug free" compiler. The validation suite only tries
to see if the implementation conforms to the written standard as
far as what features must be supported and how.
Also, I feel that anyone who believes that Ada is any "safer" just
because of the validation suite is living in a fantasy land.
Nevertheless, I believe that the relative "safety" of Ada lies not
in the supposed lack of bugs in the implementation (and as an Ada
developer, I can assure you that all implementations have bugs),
but that it is easier to develop correct software in Ada than in
most other languages. By correct I mean that the program does what
the designer intended, ignoring bugs in the compiler/RTL/OS. And
I agree with this philosophy.
However, I wouldn't dream of suggesting that a 100% correct SDI
implementation is possible, no matter what language is used. This
should not stop us from using Ada for all its advantages.
Steve
|
126.2 | Nothing new under the sun | NOBUGS::AMARTIN | Alan H. Martin | Thu Jan 22 1987 15:50 | 19 |
| Re .0:
Detecting arbitrary programs with aliasing bugs or unstable computations
in a mechanical fashion is no easier than detecting programs that may
contain an infinite loop - it is computationally undecidable. This
is true for most languages (almost all of the popular ones, I'm sure).
You should no more expect a compiler for any sufficiently general language
to contain a complete and correct "order dependency finder" than you
should expect it to contain an "infinite loop detector". (Either is
as likely as discovering that this year's Fords are powered by a perpetual
motion machine instead of a gasoline-powered internal combustion engine.)
On the other hand, a program can be shown to be free of order dependencies
by a proof, just as it can be proved to halt. Whether this is actually
done for a program depends on whether it has a sufficiently important job
to do (for instance, one which affects human lives), and what the cost
of generating and maintaining the proofs is.
/AHM
|
126.3 | | CHOVAX::YOUNG | Back from the Shadows Again, | Thu Jan 22 1987 22:56 | 13 |
| Re .2:
Hmmm, I have never heard of the impossibility of detecting aliasing
bugs or unstable computations deterministically, Alan. Could you
point me to some references on this?
Also, as I have stated before in this conference, and in the MATH
conference, the halting problem is only known to be impossible for
Turing-Machine-equivalent languages. There are in fact useful
languages for which it IS possible to write an infinite loop detection
program.
-- Barry
|
126.4 | | SMOP::GLOSSOP | Kent Glossop | Fri Jan 23 1987 01:01 | 74 |
| Please note that order of evaluation is only one of many things that
can change in a compiler from release to release and could potentially
make a "working" program that violates the language standard fail. Other
examples include (but are certainly not limited to):
- The allocation of variables. This includes allocating an
uninitialized variable to a location that happens to contain
a different initial value, generating code that leaves a
different initial value in the location, and causing addressing
to fail because the address of the allocation does not meet
previous, but not specified, criteria. (For example, the
variable is used in a hardware instruction that requires
quadword alignment, but the alignment is not specified in
the source program.)
- The behavior of unspecified language features. For example, a
language that is not required to do bounds checking might allow
out-of-bounds intermediate results without overflowing in one
release but not in another. (Consider PL/I. PL/I allows more
precision to be used by the implementation than what was specified.
If a programmer declares a number to be FIXED BINARY(3) - i.e.
3 binary digits of precision - the implementation is free to
use a byte to represent the value.)
Note that even the order of evaluation is much more pervasive than you
might expect. You would have to specify that all parameters to procedures
be evaluated in a given order, and you would have to specify that all
expressions containing arithmetic operators that could potentially
overflow be evaluated in a specific order.
For example, consider the "simple" expression A + B + C, where all of
the variables are in the range minint..maxint, and the result of the
expression must be in that range as well. Given values maxint, 1,
and -2, the expression will cause an overflow if the code
is generated as (A+B)+C, but not if it is generated as A+(B+C).
I'll also point out that as the complexity of the language increases,
the ability to do error detection at compile time can drop dramatically.
Languages that allow essentially unrestricted aliasing (C comes to
mind) allow the programmer to play all sorts of dirty tricks in a
program that is "standard conforming". Even relatively innocuous
things can present a problem. For example, PL/I allows arrays to
have run-time extents. This makes it impossible for the compiler
to even make educated guesses about whether or not a particular
element of an array is initialized. (For languages that only allow
compile-time extents, that don't allow variables to have their
address taken, and that have other simplifying restrictions, the
compiler can at least provide a close approximation to detecting
some classes of uninitialized variables.)
There are far more examples that could be given of things that fall
into this category of problems (not to mention plain old semantic
errors in the source program). As was pointed out before, it simply
is not possible to determine all errors of this type at compile time
except for certain VERY restricted classes of languages. It really
seems to me that this is overkill in any case. Languages like C
are popular to a certain degree just because they DON'T have
certain classes of restrictions.
It seems to me that effort would be much better expended in trying to
determine the best way of minimizing semantic errors in program source
code, since those errors are several orders of magnitude more frequent.
(You would probably make the code compiled by compilers more reliable
by coming up with methods to improve program reliability and applying
those to the compiler than by spending effort coming up with means of
detecting and correcting the version to version and implementation to
implementation compiler differences. Compiler implementors do have the
option of not changing the generated code from release to release. In
fact, DEC has been somewhat unusual in that the compilers do generally
improve from release to release. If you used something like IBM OS PL/I,
you'd never have to worry about the generated code changing. On the
other hand, you'd never have to "worry" about your code going faster
without you doing anything or getting support for nice new tools like
LSE and SCA... ;-) The price of progress...)
|
126.5 | | SMOP::GLOSSOP | Kent Glossop | Fri Jan 23 1987 01:07 | 4 |
| One last note. There IS one class of programming languages that aren't
normally very sensitive to the order of evaluation, since it is typically
explicitly controlled - assemblers. Of course it does have a few other
minor drawbacks when you're trying to write reliable code... ;-)
|
126.6 | Proof by example; utility; unstable CTCEs | NOBUGS::AMARTIN | Alan H. Martin | Fri Jan 23 1987 08:52 | 41 |
| Re .3:
The assertion seemed obvious. For instance:
      PROGRAM ALIAS
      INTEGER I,A(2)
      COMMON A
      DATA A/0,0/
      ACCEPT *,I
      CALL SET(A(I))
      IF (A(2).EQ.0) THEN
         TYPE *,'Zero'
      ELSE
         TYPE *,'Nonzero'
      END IF
      END
      SUBROUTINE SET(X)
      INTEGER X,Y(2)
      COMMON Y
      X = 1
      Y(1) = 1
      END
Does the above program contain an illegal alias between the formal X and
the variable A(1) (Y(1)) in common?
Note that I was careful to avoid stating that all languages contained
programs which were unprovably incorrect.
How useful are those "useful" languages in which loops can be
detected? Has anyone ever sold a program for profit written in one of
them, for instance?
Re .5:
I suspect that most useful assemblers are subject to unstable arithmetic
problems in compile time constant expressions.
/AHM/THX
|
126.7 | More Proof by Example | TLE::RMEYERS | Randy Meyers | Fri Jan 23 1987 16:18 | 13 |
| Re .3 and .6:
| It is true that there are languages for which it is possible to solve
the halting problem. Those languages fall into two classes. Either
those languages are less powerful than a Turing Machine or they can
be proved not to be runnable on a computer (that is, they cannot be
transformed into the notation of a Turing Machine).
The languages of the second class are fairly useless. The languages
of the first class, although limited, have their applications, and people
do seem to make money selling programs in them. For example, look at
any of the books that give prepared spreadsheet models for common
business applications.
|
126.8 | End Digression | TLE::RMEYERS | Randy Meyers | Fri Jan 23 1987 16:20 | 2 |
| And back to the subject, do you really want SDI software written in
Lotus 1-2-3?
|
126.9 | | PSW::WINALSKI | Paul S. Winalski | Sat Jan 24 1987 13:28 | 7 |
| I think that too many people are hiding behind questions like, "Do you REALLY
want an SDI coded in ADA???"
Why don't you come out and say what you really mean: "Do you really want an
SDI coded, PERIOD???"
--PSW
|
126.10 | Two in one | THE780::PEIRCE | Michael Peirce, Santa Clara FAC | Sat Jan 24 1987 16:04 | 11 |
| I agree with .9: there are two questions here wrapped in one:
(1) Should we build the type of systems called for in the SDI with
the state of the art in software engineering? I.e., should we
bet our defense/freedom/lives on it working as expected?
(2) Given that we are building some of the SDI-type systems, what
language should they be coded in, be that Ada, Fortran, C,
Prolog, whatever?
|
126.11 | | QUARK::LIONEL | Three rights make a left | Sat Jan 24 1987 21:29 | 15 |
| Re: .10, point 2
Given that some SDI software will be built anyway, I'd prefer that
A) It be built in the most appropriate language, and Ada is
perfectly appropriate for much of it
B) It should be built using DEC products wherever appropriate.
If it's going to be built, they may as well use the best
basis for it. This is why I don't lose sleep over my
work on VAXELN Ada - I'll just make it the best damned
Ada product for embedded systems there is and urge
everyone to use it.
Steve
|
126.12 | There are other relevant realms | MODEL::YARBROUGH | | Mon Jan 26 1987 12:11 | 22 |
| Whether we are discussing SDI specifically is irrelevant. For the most
part, the same issues apply if the universe of discourse is Air Traffic
Control systems, or any other space in which human lives are at stake.
Since, I believe, we agree that *perfect* implementations of such software
are not possible (although such a lofty goal MUST be kept before us), the
questions boil down to what are acceptable costs for achieving a reasonable
probability of success.
What we do not have at present, I think, is enough data to determine what
the probabilities of success of very large systems are, and how much it
costs to change those figures significantly. The best data we have is
qualitative, not quantitative: we know that certain methods are better than
others, but we can't say how much yet. The best data I have is that yes,
ADA is better for doing this sort of thing than any other relevant
language, but how much it costs to make a reasonable SDI/ATC/etc.
implementation is beyond knowing now.
(Flame on)
The tragedy of recent strategic weapons systems is that their cost may have
been almost entirely wasted, as they have never been, and are likely never
to be, used.
(Flame off)
|
126.13 | SDI must be lower risk than ATC, etc., right? | NOBUGS::AMARTIN | Alan H. Martin | Mon Jan 26 1987 13:31 | 6 |
| Re .11:
I find it ironic that Digital has restrictions on the sale of systems for
direct control of nuclear reactors and air traffic control, yet doesn't
seem to have similar rules in a domain where a mistake could cause WW III.
/AHM
|
126.14 | Comments on a light note... | MLOKAI::MACK | Embrace No Contradictions | Mon Jan 26 1987 14:09 | 18 |
| Re .12:
> The tragedy of recent strategic weapons systems is that their cost may
> have been almost entirely wasted, as they have never been, and are
> likely never to be, used.
Gee, as I understood it, the goal was that they not be used, and the
justification of the cost was that, if we spend enough on them, we can
build one fancy enough that they never will be used.
Re .13:
Yes, but then presumably, after a reactor blows somebody can sue
you. You don't have the same problem if everything blows up. :-)
Strange mood today,
Ralph
|
126.15 | Ada(R) is a Program, not a Language | TLE::BRETT | | Mon Jan 26 1987 14:43 | 8 |
| The other misunderstanding being continued by .0 is that the Ada
language, per se, is the main thrust of the Ada Programming effort.
It is not. The effort is a wholesale overhaul of all aspects of
s/w engineering, including (but not limited to) editing environments
(all the way from design-editing to test-creation), file systems,
debugging tools, security issues...
/Bevin
|
126.16 | The Ada Environment | TLE::RMEYERS | Randy Meyers | Mon Jan 26 1987 18:44 | 23 |
| Re .15:
The mistake is a natural one. The Reference Manual does define Ada as
a programming language, not as a software engineering environment with
a programming language embedded within it. In current usage, the word
Ada, I believe, is still considered to be the language, not the
environment. One hears the phrases "Ada" (for the language) and "Ada
environments" more often than the phrases "Ada language" and "Ada" (for
the environment).
I think that we can both agree that when the DoD says that they want
something written in Ada, they also presume that an Ada environment
will be used as well. Ada brings with it a way of life...
Back to the main topic:
If SDI is built, I would want them to use Ada (the language) for the
software, so that there is a better chance that the software works.
On top of that, I would prefer that an Ada environment be used so that
the chances are even better that the software works and so that a few
billion dollars be saved developing the software.
The above is an endorsement of Ada, not SDI.
|
126.17 | issues are consistency and maintenance | OMEGA::CLARK | | Tue Jan 27 1987 11:50 | 30 |
|
Okay, as far as programming environments go, the ADA environment is the
hands down winner (in terms of maximizing the chances of producing a bug-
free system). The question I was trying to raise concerned the language
itself. In many respects, the language itself also is better as a tool
for engineering reliable systems (compared with other popular languages).
But, there is one dimension of reliability where it falls short:
With some languages, if you've tested module X, and you are
confident that it works as compiled by version V1.5 of a compiler,
you can be very sure that it will work as compiled by version V1.6
of the compiler. With ADA, and similar languages, where each
compiler is free to treat order dependencies (and related software
abnormalities) in a different way, you cannot be quite as certain
that a module will continue to be "known good" going from one
release of a compiler to the next.
Since ADA does make forward progress in terms of reliability in so many
other dimensions, why should it move backwards on this dimension of
consistency from one release to the next? If the cost of fixing order of eval
is only a few machine cycles here and there, why shouldn't ADA do this,
in the interest of boosting reliability?
Are we going to put a clause in each software sale that says "if you have
used previous releases of this compiler, we recommend that before
deployment of applications based on the new release, you retest all components
generated by this compiler?"
pac
|
126.18 | I've seen them already | TLE::REAGAN | John R. Reagan | Tue Jan 27 1987 12:40 | 4 |
| The clause you mention is already in most if not all contracts for
government software. (At least the ones I've had dealings with...)
-John
|
126.19 | Star Wars computing presentation in Chelmsford | SMURF::JMARTIN | DDT: the sonic screwdriver of TOPS | Tue Apr 28 1987 09:33 | 58 |
| Free admission Wednesday, April 29, 7:30pm
Question and Answer Session Old Town Hall at the center
RELIABILITY AND RISK: COMPUTERS AND NUCLEAR WAR
A 34-minute slide/tape presentation
Reliability and Risk...
...investigates whether computer errors in key military systems--some of
them unpreventable errors--could trigger an inadvertent nuclear war.
...features technical, political, and military experts discussing the role
of computers at the heart of civilian and military systems, from the space
shuttle to nuclear weapons to Star Wars;
...describes the ways in which all large, complex computer systems make
mistakes--often unexpected and unpreventable mistakes:
o The 46-cent computer chip failure that led to a high-priority
military alert.
o The software error that led to the destruction of the first
Venus probe.
o The design flaw that caused a missile early-warning computer to
mistake the rising moon for a fleet of Soviet missiles.
...explores the growing reliance on computerized decision making and how a
computer error could trigger a disaster, especially in a time of crisis.
...explains why we should not rely exclusively on computers to make
critical, life-and-death decisions.
...uses straightforward language and graphics and is recommended for all
audiences. No technical knowledge is required.
...received a Gold Award in the Association for Multi Image New England
competition in November, 1986--the largest multi-image competition in the
country.
Speakers in Reliability and Risk include:
o Lt. General James A. Abrahamson, Director, Strategic Defense Initiative
Organization (SDIO)
o Lt. Col. Robert Bowman, Ph.D., US Air Force (retired), Former Director,
Advanced Space Programs Development
o Dr. Robert S. Cooper, Former Director, Defense Advanced Research Projects
Agency (DARPA)
o Dr. Arthur Macy Cox, Advisor to President Carter, SALT II Negotiations, and
Director, American Committee on U.S.-Soviet Relations
o Admiral Noel Gaylor (retired), former Commander-in-Chief of the Pacific Fleet
o Dr. James Ionson, Director, SDIO Office of Innovative Science and Technology
o Severo Ornstein, Computer Scientist (retired) and Founder, Computer
Professionals for Social Responsibility
o Professor David Parnas, Computer Scientist, Resigned from SDIO Panel on
Computing in Support of Battle Management
o Dr. John Pike, Associate Director, Federation of American Scientists
o Dr. William Ury, Director, Harvard Nuclear Negotiation Project
o Actress Lee Grant as narrator
and many others
Reliability and Risk was produced by Interlock Media Associates and CPSR/Boston
|