[Info-vax] Viable versus ideal programming languages

Dan Cross cross at spitfire.i.gajendra.net
Thu Mar 24 08:42:11 EDT 2022


In article <ja1e8sFlhdrU1 at mid.individual.net>,
Bill Gunshannon  <bill.gunshannon at gmail.com> wrote:
>On 3/23/22 15:24, Dan Cross wrote:
>> In article <ja113lFj1nhU1 at mid.individual.net>,
>> Bill Gunshannon  <bill.gunshannon at gmail.com> wrote:
>>> Define today's standards.  :-)
>> 
>> Language standards published any time this century.  :-)
>
>Language standards with what purpose?  :-)

Well, taking advantage of new hardware capabilities, better
understanding of the use and semantics of a language, really any
number of things.

For example, recent C and C++ standards define standard memory
models for the abstract virtual machine they describe, and
define standard interfaces for atomic operations, so that one
can write portable parallel code.

In some cases, old interfaces that are dangerous or
ill-conceived are deprecated, cf. `gets`.

COBOL people like to point out that the quip about a
"65-year-old language" is inaccurate, since more recent
revisions of the language have added useful modern
functionality.  However, that's of little use if one can't get a
toolchain that supports those more recent standards.

>>> Certainly not ANSI but some of us don't really care.  K&R was
>>> good enough to develop one of the most prevalent OSes in use
>>> today.  What more is needed?
>> 
>> Unix systems of today don't use K&R C, and with good reason.  I
>> did a port of 7th Edition to the 68010 with a 68451 MMU a few
>> years ago, and updated the code to ISO C11.  Just adding
>> prototypes found bugs.
>
>Probably would have been found with lint if someone had bothered
>to try.

What makes you think they didn't?  The same group that liberated
`lint` from Johnson's compiler wrote 7th Edition, after all.
Lint doesn't catch all of these problems.

But moreover, why go through the extra step of running a
separate tool if the compiler will do it for you?  Prototypes
and well-typed formals in function definitions were a boon to
both correctness and readability, as it turns out.  I'd rather
have compilation fail due to a type mismatch than rely on
everyone remembering to run some external tool.

>But, take any of the packages available today (start
>with GNU stuff) turn on all the warnings and stand back and watch
>just how much is ignored on their production versions.  Programmers
>seem to have this feeling that warnings (and even errors if you
>can convince the compiler to ignore them) are OK if you don't
>see them and the easiest way to do that is don't look.

Three points here.

First of all, what does that have to do with anything?  Poor
hygiene around warnings is orthogonal to the desire to run code
written to a recent language standard, or the benefits of those
recent standards.  (Which is not to say that they bring _only_
benefits; there _are_ missteps.)

Second, the topic is running compilers capable of compiling the
current version of the language.  The universe of interesting
software written in e.g. C is large; much of it requires at
least a C89 compiler, more still requires C99 or C11, and C23
is on the horizon.  By restricting myself to a K&R compiler, I'm
restricting myself to an ever-smaller subset of that universe.
For example, I have an emulated VAX running 4.3BSD.  I wanted a
"modern" shell for that; I'm stuck with an ancient version of
tcsh from the previous century that could compile on that
ancient system.  Don't get me wrong, it's fun to noodle around
on, but I wouldn't consider that acceptable for serious work in
this day and age.

Third and finally, warnings themselves can be overly aggressive
or misleading.  For example, `gcc` has a warning if it detects
that one is calling `strncpy` in such a way that the length
argument is exactly the size of the destination buffer.  This
is because people don't understand the semantics
of `strncpy` and often get it wrong in such a way that they may
not NUL-terminate the destination.  But the warning is in some
sense too aggressive, and precludes cases where either a) the
programmer _actually_ wants that behavior (for example, they may
be filling in a fixed-length field of some data structure that
might be written to a storage device, or copied over a network,
or otherwise used in some way where the notion of a "C string"
isn't appropriate or desired) or b) they may use some other
common idiom to ensure proper termination:

    char dst[128];
    strncpy(dst, src, sizeof(dst));
    dst[sizeof(dst) - 1] = '\0';

etc.  (Of course, the real solution here is to use a more
appropriate function; perhaps `strlcpy`:
https://www.sudo.ws/todd/papers/strlcpy.html).

Anyway, the point is, not all warnings are equally valuable.  In
the GCC `strncpy` case, the cure to silence the warning may be
worse than the disease.

>>> As for the other languages.  I run the same Fortran that was in
>>> use on everything from minis to mainframes.  Full Pascal on both
>>> the Z80 and 6809.  Same COBOL that ran on Primes, Univac 1100, RSX,
>>> RSTS, and even VMS.  And a version of BASIC on the 6809 that is
>>> far beyond the BASIC that came out on the PC years later. And APL
>>> by its very nature is a perfect fit for these systems. There were
>>> a number of other languages, PILOT, Smalltalk, Lisp etc. but they
>>> weren't really any more successful on bigger machines.
>>>
>>> I wonder how much of this notion that small systems aren't useful
>>> for anything but playing games contributed to companies like DEC
>>> missing the boat when the micro world came along.
>> 
>> You can certainly do all kinds of useful stuff on small systems,
>> but in this day and age it begs the question: why?  Aside from
>> an interesting academic exercise, I don't much see the point
>> of a hosted environment on a z80 or 6809, particularly when a
>> Raspberry Pi Zero costs $5 or something and gives you so much
>> more.  Use those devices as a _target_ platform for something?
>> Sure.  But a host?  Why bother?
>
>Because I already have them.  Because I like nostalgia.  Other
>reasons I'm sure.

So, an interesting academic exercise.  :-)

>Oh, and I use the Pi and Arduinos as well.
>But they just aren't as much fun.

That's fair.  I run a simulated CDC Cyber 6600, a bunch of Prime
stuff, a couple of VAXen and PDP-11s running Unix, TOPS-20,
Multics, Plan 9 etc.  It's more fun than a cookie-cutter
Raspberry Pi running Linux.  But it's also important to be aware
of the limitations of those things; I do that for fun, but for
serious work I look to modernity for practical reasons.

>Lately I have been thinking
>about tasking another look at Amoeba.  Imagine what could be
>done with an Amoeba cluster of Pi's.  :-)

In seriousness?  Amoeba was an interesting research system 30
years ago.  These days, I'd think a cluster of Pi's running Plan
9 would be more interesting (note that Sape Mullender went to
Bell Labs and worked on Plan 9 after working on Amoeba).

	- Dan C.



