help needed fread'ing data into a buffer using Turbo C++ an
Don_A_Corbitt at cup.portal.com
Sat Sep 29 11:45:14 AEST 1990
>
> Hello Netlanders!
>
> As the subject says, I'm having some trouble reading data into an 80K buffer,
> or any buffer larger than 64K actually, using fread. I'm using the huge
> model. The code goes something like this...
>
> #include <all.the.necessary.includes>
>
> main()
> { unsigned char *buffer, *cptr;
Even in the huge model, you must declare pointers to objects larger than 64KB
as "unsigned char <<huge>> *buffer".
>
> buffer = (unsigned char *) farmalloc( 81920 );
>
> for (i=0, cptr=buffer; i<320; i++, cptr += 256 )
> { fread( (void *) buffer, sizeof( char ), (size_t) 256, infile );
I assume you really mean to read into cptr here, not buffer -- as written,
every pass overwrites the same first 256 bytes.
> printf("row: %d, farheap status: %d\n", i, farheapcheck( ));
> }
> }
>
> For the curious, there is an fseek before the fread to correctly
> position the file pointer in the original program.
>
> Everything works fine up until it reads in row 255, the piece of data which
> is crossing the segment boundary. Farheapcheck() then reports that the heap
> is corrupt. I was under the impression that since I'm using the huge model
> that ALL pointers are normalized and that data can occupy more than 64K. Is
> this a problem with fread? If I remember correctly, I didn't have this
> problem with Microsoft C 5.0. Any suggestions would be appreciated.
>
> Thanks in advance.
>
> -Alvis
My mailer couldn't recognize your address, so I'm posting instead.
I recommend you post machine-specific questions to a machine-specific
group, such as comp.os.msdos.programmer, where you can probably find lots
of people who know about peculiarities of huge pointers and MS-DOS.
---
Don_A_Corbitt at cup.portal.com Not a spokesperson for CrystalGraphics, Inc.
Mail flames, post apologies. Support short .signatures, three lines max.