The Fundamental Concept of Programming language X

Sean Matthews sean at aipna.ed.ac.uk
Wed Jan 3 04:29:49 AEST 1990


In article <1470 at mdbs.UUCP> wsmith at mdbs.UUCP (Bill Smith) writes:
>The idea is "fundamental concept."  The fundamental concept of
>a language is the data structure, control structure or style issue
>that distinguishes the language and must be understood in detail before 
>proficient use of the programming language can begin.
>
[List that to me seems far from fundamental]

The notion of what is a fundamental concept of a programming language
is interesting, but it seems to me that what has been attributed to
programming languages as `fundamental' here is wrong, being far too
concerned with superficialities.

Take first some of the sorts of models that we have for computers:

1. the Von Neumann stored program/ processor machine.

2. the lambda calculus.

2a. combinatory logic

3. communicating processes of various sorts.

4. proof search (e.g. sequent calculus)

and consider them in turn.

The Von Neumann machine, which is a sort of sophisticated Turing
machine, comes to us in all sorts of forms, and has had a decisive
influence (necessarily, probably) on the way computers have been built
up to now, and (unfortunately, definitely) on the way computer
scientists have been educated up to now.

It has produced languages such as Fortran, Algol{58,60,68,W}, COBOL,
C, PL/1, Simula, Pascal, Modula{2}, BCPL, Ada, Mesa, BASIC etc.

And these languages have had such an influence on computer programming
that most books on programming language design deal only with what to
me seem the minor differences between them.  The similarities between
them all are such that to be able to program in one is to be able to
program in all of them.  One simply decides what is needed for a
particular task: C is a relatively low-level language suited to
certain types of systems programming, Fortran is good for bashing
numbers, but they are all essentially the same.

One could make a secondary distinction between the languages that are
stack based (e.g. C), and those that are not (e.g. Cobol); the
former would then have a closer affinity with functional programming
languages (see below).  Or one could distinguish between those that
have some form of dynamic storage (e.g. Pascal) and those that do not
(e.g. Fortran).

OK, so I exaggerate a bit: a typical COBOL or BASIC programmer is going
to have more difficulty coming to terms with Algol-68 or C than the
other way around, but even an Algol-68 programmer is parochial---he
still thinks in terms of a load and store architecture with state and
sequential operations and procedures.

I think that this model has no theoretical virtue; all it offers is
real world efficiency.  In evidence, I would point to the `formal'
descriptions of algorithms that I see in computer text books, which
amount to writing out the algorithm in pseudo PL/1, and which make me
want to fling the book against a wall.  These are not what I would
call formal specifications, since they hopelessly confuse
implementation with design.  This does not condemn the model itself;
the trouble is that programmers identify it with what computers are,
instead of recognising that it is a subset (and an expressively weak
subset) of what is possible.

The second model that I have listed is the lambda calculus, which is a
much nicer theoretical (and practical) model than load and store.  I
have written real programs in what amounted to pure lambda calculus,
and I would rather use it than Algol-68 for most things (but that is
personal prejudice).  What languages use it?  Simple functional
programming languages, including some forms of Lisp, such as Lispkit
Lisp (is that right? if Peter Henderson is on the net he can correct
me).  Pure Lisp was a partial stab at it, but it has dynamic scoping,
which makes a mockery of any pretensions it has to pure functional
language-hood.  One of the interesting things about Lisp is how
entrenched the load and store mindset was, even in the early sixties:
the people who developed it immediately started to shift it from a
lambda calculus foundation, on which it could have settled properly
after a slightly unsteady start, towards being more like Fortran.  And
to head off any criticism, yes, I do realise that something had to be
done to Lisp to make it practical on the machines of the time, but did
it have to be *quite* so violent?
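The flavour of programming in pure lambda calculus can be sketched in
Python, since its lambdas transliterate the calculus almost directly.
This is only an illustration (Church numerals, the standard encoding
of arithmetic as pure functions), not anything from any particular
Lisp:

```python
# Church numerals: arithmetic in pure lambda calculus, transliterated
# into Python lambdas.  Everything is a function; there are no
# built-in numbers, assignments, or control structures in the encoding.

zero = lambda f: lambda x: x                        # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))     # one more application of f
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul  = lambda m: lambda n: lambda f: m(n(f))        # compose: f applied m*n times

def to_int(n):
    """Convert a Church numeral to an ordinary int, for inspection only."""
    return n(lambda k: k + 1)(0)

two   = succ(succ(zero))
three = succ(two)
```

Here `to_int(add(two)(three))` comes out as 5, computed entirely by
function application.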

One interesting side effect of adding imperative structures (a.k.a.
assignment) to Lisp while trying to fix the problem of dynamic
scoping, was to produce the idea of closures, which appears in Scheme,
and can be seen in retrospect in Smalltalk (which you could think of
as going in the other direction, frantically chasing *some* of the
virtues of functional programming systems from an Algol derived base).
Of course the actual motivation behind Smalltalk was very different,
but it gives a good place to file the ideas that underlie it; it would
be a mistake to think that objects are an entirely novel concept.
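The idea of a closure mentioned above can be shown in a few lines; a
minimal sketch (in Python, whose scoping is lexical in just the way
that matters here):

```python
# A closure: the inner function captures the lexically scoped variable
# `count`, carrying a private piece of state around with it -- the
# idea that Scheme made central.

def make_counter():
    count = 0
    def counter():
        nonlocal count      # mutate the captured variable, not a global
        count += 1
        return count
    return counter

c1 = make_counter()
c2 = make_counter()         # each closure gets its own captured state
```

Calling `c1()` twice yields 1 then 2, while `c2()` still starts from
1: the state lives in the closure, not in any shared global.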

Further out, there is the typed lambda calculus, which allows much
stronger, and more flexible, typing systems than what passes for them
in Algol-style languages.  (And then there are `type theories', which
are about as strongly typed as you can get and sacrifice nothing in
the process except bugs, and produce `programming languages' such as
NuPRL, but that is a discussion for another time.)  You get this in
programming languages such as ML.

In fact, ML is an interesting case, since it started out with pure
typed lambda calculus and then added as much imperative structure as
was needed (it turns out to be very little), and got a programming
language that, for complex tasks, need not be much less efficient
than C.
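That balance---almost entirely applicative, with a small, contained
dose of imperative state where it pays---can be sketched as follows (a
Python illustration of the style, not ML itself; `memoize` and `fib`
are my own example names):

```python
# Mostly-functional style with one contained piece of imperative
# state: the cache dictionary is the only mutation, and it is hidden
# behind an interface that still looks pure from the outside.

def memoize(f):
    cache = {}                      # the single piece of mutable state
    def wrapped(n):
        if n not in cache:
            cache[n] = f(n)         # compute once, remember forever
        return cache[n]
    return wrapped

@memoize
def fib(n):
    # A purely applicative definition; the imperative cache makes it fast.
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```

The definition of `fib` reads like a recurrence equation; the mutation
is an implementation detail that never leaks out.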

I have also listed a section 2a, combinatory logic, which is where we
could put languages like FP, and even APL.  FP is clearly combinatory,
while I would argue that the way APL works is much more influenced by
the combinatory style it encourages than by the fact that it works
with arrays.  (I know that there is assignment in APL, but in practice
people try to avoid it and write one-liners instead, since that is the
more powerful and natural way to think in APL; the pity is that it has
such a limited set of combinators.)
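The combinatory, one-liner style can be sketched like this (again a
Python illustration, not FP or APL; `compose` and `sum_odd_squares`
are names I have made up for the example):

```python
# Combinator-flavoured style in the spirit of FP or an APL one-liner:
# the computation is built by composing functions, with no assignment
# to intermediate variables.

from functools import reduce

def compose(*fs):
    """Right-to-left function composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fs)

# "sum of the squares of the odd numbers" as one composed pipeline
sum_odd_squares = compose(
    sum,
    lambda xs: map(lambda x: x * x, xs),
    lambda xs: filter(lambda x: x % 2, xs),
)
```

Applied to `range(10)` this gives 1 + 9 + 25 + 49 + 81 = 165, with the
whole computation expressed as a single composed function.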

So what are the advantages that lambda calculus gives us?  Higher
order functions, powerful typing systems, no side effects (except when
we want them), compact code, more abstraction.  The elimination of the
Von Neumann bottleneck, both in the programmer and in the machine; we
no longer have to think in terms of one thing at a time.

What about paradigm three?  Well, here there is the notion of Hoare's
CSP (yes, and Milner's CCS, but let us not complicate things), best
seen in occam (no capitals), a load and store language that has been
heavily modified to remove the Von Neumann bottleneck mentioned above.
In fact it is barely still a load and store language, since so much of
it is about data flowing actively around the machine, going and
getting itself processed instead of waiting in the queue.  A much less
important implementation is in Ada, where the tasking system is
similar, but it is not obvious enough or well enough integrated to
affect the mindset of the programmer, being bolted onto the side
instead.
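The CSP flavour---sequential processes that share no state and
interact only over channels---can be sketched with a thread and a
queue (a Python approximation; occam's channels are synchronous, a
Queue is merely close enough for the flavour, and `producer` and
`consumer` are names invented for the example):

```python
# CSP-flavoured sketch: two processes that share no variables and
# communicate only by sending values over a channel.

import queue
import threading

def producer(chan):
    for i in range(5):
        chan.put(i * i)       # send a value down the channel
    chan.put(None)            # sentinel: end of stream

def consumer(chan):
    total = 0
    while (item := chan.get()) is not None:
        total += item         # receive and process each value
    return total

chan = queue.Queue()
t = threading.Thread(target=producer, args=(chan,))
t.start()
result = consumer(chan)       # 0 + 1 + 4 + 9 + 16
t.join()
```

Neither process touches the other's variables; the only coupling is
the channel, which is the whole point of the paradigm.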

Paradigm four is maybe the least obvious to the typical programmer.
Prolog is the most obvious implementation of this sort of thing.  I am
not sure what we learn from it (I think it---i.e. Prolog, and more
generally, logic programming---is overrated as a programming
technique) but it does at least give some good ideas about what can be
done with pattern matching and non-determinism, and it does give a new
set of intellectual tools with which to approach any problem.
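The non-determinism idea can be sketched with generators: each choice
point yields alternatives, failure is yielding nothing, and
backtracking falls out of nested iteration (a Python illustration of
the idea, not Prolog; `members` and `solve` are example names, and the
toy goal is x + y = 5 with x < y):

```python
# Prolog-style non-determinism via generators: a choice point tries
# each alternative in turn, and the search backtracks automatically
# when an alternative fails to yield a solution.

def members(xs):
    yield from xs             # a choice point: try each value in turn

def solve():
    # find all (x, y) with x, y drawn from 1..4, x + y == 5 and x < y
    for x in members(range(1, 5)):
        for y in members(range(1, 5)):
            if x + y == 5 and x < y:
                yield (x, y)  # a solution; the search resumes on demand

solutions = list(solve())
```

This finds (1, 4) and (2, 3) and nothing else; asking for one solution
rather than all of them is just a matter of not exhausting the
generator.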

Sorry if the above is a little scrappy, but it was typed in one go and
it is not going to be edited.

Anyway, I think I got my point across.  Of course in practice, none of
these categorisations is complete, if only because most of them are
`infected' with the idea of state from the Von Neumann machine---the
only ones that are not impress the theoreticians but are not much use
in practice.

The important thing is to realise that the real differences are not in
the sorts of data that programming languages deal with, but in the way
that they allow you to approach what you are doing.

Think in terms of natural languages: (unfortunately) most programmers
are in the position of linguists who have studied Italian and Spanish,
and maybe a little German, and generalise cheerfully from that
knowledge while having no idea that there are languages such as
Chinese or Arabic.  Think of the interesting ideas they would get from
exposure to all the products of those cultures; how their ideas of
what a narrative is, or what a poem is, would be enlarged.

Sean


