Memory Models

Steve Summit scs at adam.pika.mit.edu
Sat Aug 12 17:42:25 AEST 1989


In article <562 at dcscg1.UUCP> drezac at dcscg1.UUCP (Duane L. Rezac) writes:
>I am just getting into C and have a question on Memory Models. I have not
>seen a clear explanation on just what they are and how to determine which 
>one to use. Does anyone have a short, clear explanation of these for  
>someone just starting out in C?

The short answer is, stay as far away from this disgusting
concept as possible.  Segments buy nothing but trouble.

In article <5653 at ficc.uu.net> peter at ficc.uu.net (Peter da Silva) writes:
>Always use the smallest model you can get away with...
>The massive performance advantage of small model over large is a
>strong reason...

"Massive"?  I've never noticed any difference at all.  (This is
not to say that there is none, only that it is not noticeable in
the programs I write.  I am sure that some difference can be
demonstrated, but I believe neither that the far pointer overhead
is unacceptable, nor that the overhead is inherent in 32-bit
addressing -- that is, the problem is with the segment/offset
distinction and the gory code it forces compilers to generate.
A sensible, "flat" 32-bit address space could certainly be
implemented with little more overhead than the 16-bit near
addressing modes.)
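
(If you want to see where that gory code comes from, here is a
minimal sketch.  It assumes a compiler with the non-standard
"far" keyword, as in Microsoft C or Turbo C; the addresses and
names are invented for the occasion.)

    #include <stdio.h>

    int main(void)
    {
        /* Two far pointers naming the same physical byte:
         * 16 * segment + offset == 0x00100 for both. */
        char far *p = (char far *)0x00100000L;  /* 0010:0000 */
        char far *q = (char far *)0x00000100L;  /* 0000:0100 */

        /* Equality on far pointers compares the raw 32-bit
         * seg:off values, so p == q comes out false even though
         * both point at the same byte.  Plain far arithmetic
         * touches only the 16-bit offset, and dereferences keep
         * reloading segment registers -- hence the extra code
         * the compiler is forced to emit. */
        printf("p == q yields %d\n", p == q);
        return 0;
    }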

I find segments to be significantly deleterious to _my_
performance.  Just today a program began misbehaving; it turned
out that a few utility object files it was importing from another
source directory were of the wrong model.  (Since the last time
I'd built this program, one source directory's makefile had been
changed to use large model, but the other hadn't.)

The Intel and Microsoft object file formats and linkers only make
a bad idea even worse: there are no explicit warning or error
messages when object files compiled with different memory models
are linked together, although a program so linked is virtually
guaranteed not to work.  If you're lucky you'll get a "fixup
overflow" or some such error at link time (not exactly
descriptive, but better than nothing).  More likely, though, the
link completes silently but the near/far subroutine call mismatch
causes the machine to lock up on the first function call.  Since
a lockup is also the symptom of seemingly every other simple
programmer error on this benighted architecture, memory model
mismatch isn't always the first thing I look for.
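
(To make the failure concrete, here is a sketch of the mismatch
I mean.  The file names, the function, and the driver lines are
mine; "-ms" and "-ml" are Turbo C's small- and large-model
switches, and Microsoft C spells them "/AS" and "/AL".)

    /* util.c -- compiled large model:  tcc -ml -c util.c */
    int triple(int n)
    {
        /* In large model this is a far function: it returns
         * with a far RET, popping a 4-byte return address. */
        return 3 * n;
    }

    /* main.c -- compiled small model:  tcc -ms -c main.c */
    extern int triple(int n);  /* small model assumes near */

    int main(void)
    {
        /* The near call pushes only a 2-byte return address;
         * the far return pops two extra bytes and jumps through
         * garbage.  The linker says nothing, and the machine
         * locks up on the first call. */
        return triple(14);
    }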

Ever since I started having these object file incompatibility
problems, I've concluded that I ought to just use large model
everywhere, since it's the most universal.  (I've got lots of
different directories sharing object and library modules; while
some programs need large model, none need small.  I like to think
that sharing object code is a good idea, preferable both to
reinventing the wheel and to maintaining multiple, potentially
inconsistent copies of source or object files in multiple
directories.  The presence of multiple memory models, however,
actively discourages such sharing.)  If I had the time and the
courage I'd convert every makefile in sight, but there's no
telling how much I'd break in the interim.

(Another horrific problem has to do with binary data files.  If
you write out a structure that happens to contain a pointer, you
can't read it in with a program compiled with a different memory
model.  I always knew binary data files were a bad idea, but I
used to think they only compromised transportability between
machines of different architectures, not different programs on
the same machine.  Now I have several thousand data files,
written by an old, small-model program, which can't be read by
new, fancier programs which want to be built large-model.)
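
(In case the mechanism isn't obvious: a pointer occupies 2 bytes
in small model and 4 in large, so any structure containing one
changes size -- and hence file layout -- with the memory model.
Here's a sketch, with names invented for the occasion.)

    #include <stdio.h>

    struct record {
        int   id;
        char *name;   /* 2 bytes small model, 4 bytes large */
        long  value;
    };

    int main(void)
    {
        struct record r = { 17, "sample", 42L };
        FILE *fp = fopen("sample.dat", "wb");

        if (fp == NULL)
            return 1;

        /* sizeof r differs between the models, so the bytes on
         * disk do too: a small-model writer and a large-model
         * reader disagree about where each field -- and each
         * subsequent record -- begins. */
        fwrite(&r, sizeof r, 1, fp);
        fclose(fp);
        return 0;
    }

The stored pointer is useless on re-reading under any model, of
course; the trouble is that it pads the record differently.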

In article <20728 at uflorida.cis.ufl.EDU> brs at beach.cis.ufl.edu (Ray Seyfart) writes:
>There is one significant reason to choose the small memory model if
>it is sufficient:   pointers will not point outside the program's
>address space.  This is important in MS/DOS, since there is no
>memory protection.

An interesting observation, which may have some merit.  Still,
I've been crashing my PC daily for as long as I've been using it
(due primarily to the lack of memory protection, or of any other
kind), and since I don't always use large model, plenty of those
crashes happened under small model: it is evidently not
sufficient if you want to avoid baffling problems.

                                            Steve Summit
                                            scs at adam.pika.mit.edu


