Memory Models

Michael Davidson md at sco.COM
Tue Aug 15 11:15:20 AEST 1989


In article <10703 at smoke.BRL.MIL> gwyn at brl.arpa (Doug Gwyn) writes:
>In article <562 at dcscg1.UUCP> drezac at dcscg1.UUCP (Duane L. Rezac) writes:
>>I am just getting into C and have a question on Memory Models.
>
>That is not a C language issue.  It's kludgery introduced specifically
>in the IBM PC environment.  Unless you have a strong reason not to,
>just always use the large memory model.  (A strong reason would be
>compatibility with an existing object library, for example.)
Sorry, but it is an evil necessity brought about by the segmented
architecture of the Intel 8086 and 80286. Although the most common
place these processors show up is the IBM PC environment, this
kludgery follows them wherever they go.

Actually, better advice is to always use small model (i.e. up to
64K of code and 64K of data), unless you really don't care about
performance. The cost of continually reloading segment registers
(which is what large model tends to do on every far pointer access)
is bad in real mode and horrific in protected mode, where each load
triggers a descriptor fetch and validation. Just remember that
small is beautiful....
