help needed fread'ing data into a buffer using Turbo C++ and huge model

Alvis Harding Jr. adh at uafcseg.uucp
Wed Sep 26 12:25:16 AEST 1990


Hello Netlanders!

As the subject says, I'm having some trouble reading data into an 80K buffer, 
or any buffer larger than 64K actually, using fread.  I'm using the huge 
model.  The code goes something like this...

#include <stdio.h>
#include <alloc.h>      /* farmalloc, farfree, farheapcheck */

int main(void)
{ unsigned char *buffer, *cptr;   /* plain pointers are far by default in the huge model */
  int i;
  FILE *infile;

  infile = fopen( "afile.dat", "rb" );

  buffer = (unsigned char *) farmalloc( 81920L );   /* 80K, more than one 64K segment */

  /* read 320 rows of 256 bytes each into successive parts of the buffer */
  for (i=0, cptr=buffer; i<320; i++, cptr += 256 )
     { fread( (void *) cptr, sizeof( char ), (size_t) 256, infile );
       printf("row: %d, farheap status: %d\n", i, farheapcheck( ));
     }

  fclose( infile );
  farfree( buffer );
  return 0;
}

For the curious: in the original program there is an fseek before the 
fread to position the file pointer correctly.
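In context the loop looks something like the following, except that the 
real offsets are computed differently (the i * 256L below is only for 
illustration):

   for (i=0, cptr=buffer; i<320; i++, cptr += 256 )
      { fseek( infile, i * 256L, SEEK_SET );   /* illustrative offset only */
        fread( (void *) cptr, sizeof( char ), (size_t) 256, infile );
        printf("row: %d, farheap status: %d\n", i, farheapcheck( ));
      }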

Everything works fine up until it reads row 255, the piece of data that
crosses the segment boundary.  Farheapcheck() then reports that the heap
is corrupt.  I was under the impression that, since I'm using the huge
model, ALL pointers are normalized and that a data object can occupy more
than 64K.  Is this a problem with fread?  If I remember correctly, I
didn't have this problem with Microsoft C 5.0.  Any suggestions would be
appreciated.
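
By "normalized" I mean what you get when a pointer is declared huge 
explicitly; the sketch below (just an illustration, not the code I'm 
actually running) is what I assumed the plain pointers in my huge-model 
program were getting for free:

   #include <stdio.h>
   #include <alloc.h>

   int main(void)
   { unsigned char huge *hbuf;   /* explicitly huge: arithmetic is normalized */
     unsigned char huge *hp;
     int i;
     FILE *infile;

     infile = fopen( "afile.dat", "rb" );
     hbuf = (unsigned char huge *) farmalloc( 81920L );

     /* hp += 256 stays normalized, so it can walk past the 64K mark */
     for (i=0, hp=hbuf; i<320; i++, hp += 256 )
        fread( (void far *) hp, sizeof( char ), (size_t) 256, infile );

     fclose( infile );
     farfree( (void far *) hbuf );
     return 0;
   }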
       
Thanks in advance.

                                                  -Alvis


