The Fundamental Concept of Programming language X

David Gudeman gudeman at cs.arizona.edu
Thu Jan 4 08:54:43 AEST 1990


In article  <1782 at aipna.ed.ac.uk> sean at aipna.ed.ac.uk (Sean Matthews) writes:
>The Von Neumann machine, which is a sort of sophisticated Turing
>machine,

Hardly.  They are related in the same sense that the lambda calculus
is related to combinatory logic, or in the same way that grammars are
related to predicate logic.

>...It has produced languages such as Fortran, Algol{58,60,68,W}, COBOL,
>C, PL/1, Simula, Pascal, Modula{2}, C, BCPL, Ada, Mesa, BASIC etc.
>
>And these languages have had such an influence on computer programming
>that most books on programming language design deal only with what to
>me seem the minor differences between them.

True enough...

>I think that this model has no theoretical virtue, all it offers is
>real world efficiency.

I disagree with this.  Algorithms were written long before there were
computers to execute them.  Part of the problem with this view is that
you are not distinguishing between the trivial models of algorithmic
computing (Turing machines and Von Neumann machines) and the
fundamental concept of an algorithm, which I define as: "a description
of a problem solution by systematic changes in a computing device,
based on rules of computation and on the current state of the device,
where the state of the device at termination encodes the problem
solution."  This view of computation is not only theoretically useful,
it is of great practical importance in finding solutions to many
problems that are conveniently solved by such a method.
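
Euclid's gcd algorithm is a standard illustration of this definition
(my example, not part of the original argument): the variables are the
"device state", each iteration applies a fixed rule of computation to
that state, and the state at termination encodes the solution.  A
minimal C sketch:

```c
/* Euclid's algorithm as systematic state change.  The pair (a, b) is
 * the state of the computing device; each iteration applies the rule
 * (a, b) -> (b, a mod b); the state at termination (b == 0) encodes
 * the answer in a. */
unsigned gcd(unsigned a, unsigned b)
{
    while (b != 0) {
        unsigned t = a % b;  /* apply the rule of computation */
        a = b;
        b = t;
    }
    return a;  /* final state encodes the problem solution */
}
```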

In fact, this method of computation is so convenient and intuitive
that it is often used as a way of understanding programs written in
other models of computation.  For example, the lambda calculus is
often treated as a rewrite system, and I claim that rewrite systems
are essentially algorithmic in nature.  The same may be said of
grammars, and even predicate logic to a certain extent.
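
The algorithmic reading of a rewrite system is easy to make concrete
(a toy of my own devising, with an invented single rule "aa" -> "a"):
the term being rewritten is the state, and each application of a rule
is a systematic state change, repeated until no rule applies.

```c
#include <string.h>

/* Toy rewrite system: repeatedly apply the one rule "aa" -> "a" to a
 * string in place until no redex remains.  The string is the device
 * state; each rewrite step is a rule-governed state change -- in
 * other words, an algorithm. */
void rewrite_aa(char *s)
{
    char *p;
    while ((p = strstr(s, "aa")) != NULL)
        memmove(p, p + 1, strlen(p + 1) + 1);  /* contract one "aa" */
}
```

Applied to "baaab" this terminates with "bab": the normal form is the
final state of the computation.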

>So what are the advantages that Lambda calculus gives us?  Higher
>order functions,

Algorithmic systems can have higher-order algorithms.

>powerful typing systems,

completely unrelated to the functional/algorithmic difference.

>no side effects (except when we want them)

No side effects _ever_, even when we want them, unless you are
willing to give up the purity of the paradigm.

>compact code

more related to the data abstractions available than to the computational model.

> more abstraction.

depends on your definition of abstraction.  If you define "more
abstract" as "less algorithmic", then this is true.

>The elimination of the Von Neumann bottleneck

This presumes that the Von Neumann machine is the basis of all
algorithmic languages.  In particular, it presumes that algorithmic
methods are limited to computation on integers, and that structured
objects must be handled one element at a time.  This is far from true,
and many of the languages you listed as Von Neumann languages present
facilities for treating aggregates at any desired level.  What they
frequently don't have is _primitives_ for treating aggregates as a
whole, but these can be (and usually are) supplied in libraries.
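
For instance, a library can supply a whole-aggregate operation even
where the language has no such primitive (vec_add is a hypothetical
routine for illustration, not from any real library): the caller
writes one call on the aggregates, not an element-at-a-time loop.

```c
#include <stddef.h>

/* Hypothetical library routine: element-wise vector addition
 * presented to the caller as a single operation on whole aggregates. */
void vec_add(double *dst, const double *a, const double *b, size_t n)
{
    size_t i;
    for (i = 0; i < n; i++)
        dst[i] = a[i] + b[i];
}
```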

A very large percentage of the criticisms of "conventional languages"
made by proponents of other systems are really criticisms of the data
types and primitive operations available in the languages.  For
example, one often sees FP proponents claim that the array data
structure is too limited because of its word-at-a-time operations.
But this criticism (which I agree with) doesn't imply that we should
switch to a functional language with sequences, it implies that we
should introduce sequences into the languages we prefer.
-- 
					David Gudeman
Department of Computer Science
The University of Arizona        gudeman at cs.arizona.edu
Tucson, AZ 85721                 noao!arizona!gudeman


