search

Bill Rust wjr at ftp.COM
Sat Aug 19 01:02:27 AEST 1989


In article <184 at stsim.UUCP> glenn at stsim.UUCP (Glenn Ford) writes:
>I am writing a small project (not related to work, personal) that has to
>do searches on a sorted database.  I search on the keyword, which is a
>string of N length and is unique among all keys.  For right now I do a
>binary search on the database, but this tends to be slow when run off a
>floppy drive (yes, it has to run on a floppy drive in SOME cases).  My
>question is: Is there a faster, yet fairly easy to implement, routine
>that can search my database?  Faster than binary search.  I don't want
>B-trees.  My database is already built.


First, I am not sure why you don't want B-trees (or some variant), since
they are fairly easy to implement and quite fast. You don't mention how
big the database is or what fraction of each record the key takes up. If
the key is a small part of the record, building a separate key file that
can be read into your application in one piece will really speed up access
to the database, since the keys are then searched at RAM speed rather than
disk speed.
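
To make that concrete, here is a minimal sketch in C, assuming fixed-length
keys and a key file whose entries are simply (key, record offset) pairs
written out by whatever built the database; the names (key_entry,
load_keys, find_record) and the sizes are placeholders, not anything your
code has to match:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define KEY_LEN 16                    /* assumed fixed key length        */

struct key_entry {
    char key[KEY_LEN];                /* the search key                  */
    long offset;                      /* byte offset of record in the db */
};

static struct key_entry *index_tab;   /* whole key file, held in RAM     */
static size_t index_count;

/* Read the entire key file into memory in one gulp. */
int load_keys(const char *path)
{
    FILE *fp = fopen(path, "rb");
    long size;

    if (fp == NULL)
        return -1;
    fseek(fp, 0L, SEEK_END);
    size = ftell(fp);
    rewind(fp);
    index_count = (size_t)size / sizeof(struct key_entry);
    index_tab = malloc(index_count * sizeof(struct key_entry));
    if (index_tab == NULL ||
        fread(index_tab, sizeof(struct key_entry), index_count, fp)
            != index_count) {
        fclose(fp);
        return -1;
    }
    fclose(fp);
    return 0;
}

static int cmp_key(const void *key, const void *elem)
{
    return memcmp(key, ((const struct key_entry *)elem)->key, KEY_LEN);
}

/* Binary search at RAM speed; at most one disk seek+read follows. */
long find_record(const char *key)
{
    struct key_entry *e = bsearch(key, index_tab, index_count,
                                  sizeof(struct key_entry), cmp_key);
    return e != NULL ? e->offset : -1L;
}

After load_keys() runs once at start-up, every lookup is a bsearch() in
memory plus at most one fseek()/fread() into the main file.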

Also, if the database is a fixed size, a hashing function (one that
converts the key directly to a record position) may be most appropriate.
If you want a really good discussion of all of this, consult Knuth. Just
remember that you want to minimize the number of read operations to the
disk.
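
For the hashing case, something along these lines would do, assuming the
database is laid out as a hash table of fixed-length records with the key
at the front of each record and an empty slot marked by a leading '\0';
TABLE_SIZE, RECORD_SIZE, and the hash itself are all invented for
illustration:

#include <stdio.h>
#include <string.h>

#define TABLE_SIZE  997L          /* assumed number of record slots       */
#define RECORD_SIZE 128L          /* assumed fixed record length          */
#define KEY_LEN     16            /* key is assumed to start each record  */

static long hash_key(const char *key)
{
    unsigned long h = 0;
    int i;

    for (i = 0; i < KEY_LEN && key[i] != '\0'; i++)
        h = h * 31 + (unsigned char)key[i];
    return (long)(h % TABLE_SIZE);
}

/* Fetch the record for `key'; 0 on success, -1 if it is not there. */
int fetch_record(FILE *db, const char *key, char *rec)
{
    long slot = hash_key(key);
    long probes;

    for (probes = 0; probes < TABLE_SIZE; probes++) {
        fseek(db, ((slot + probes) % TABLE_SIZE) * RECORD_SIZE, SEEK_SET);
        if (fread(rec, (size_t)RECORD_SIZE, 1, db) != 1)
            return -1;
        if (rec[0] == '\0')           /* empty slot: key was never stored */
            return -1;
        if (memcmp(rec, key, KEY_LEN) == 0)
            return 0;                 /* usually satisfied on the 1st read */
    }
    return -1;
}

When the table is not too full, this touches the floppy about once per
lookup, which is exactly the behaviour you want.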

One possible long shot, since you are working with floppies, is to just
read in the whole database. On a PC, a small program can read upwards of
500K of data into memory, and at that point your searching technique
matters a lot less.
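
A sketch of that approach, under the same fixed-length-record assumption
as above (slurp_db and find_in_core are made-up names):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define RECORD_SIZE 128           /* assumed fixed record length          */
#define KEY_LEN     16            /* key assumed to start each record     */

static int cmp_rec(const void *key, const void *rec)
{
    return memcmp(key, rec, KEY_LEN);
}

/* Read the whole database file into one malloc'd buffer. */
char *slurp_db(const char *path, size_t *nrecs)
{
    FILE *fp = fopen(path, "rb");
    long size;
    char *buf;

    if (fp == NULL)
        return NULL;
    fseek(fp, 0L, SEEK_END);
    size = ftell(fp);
    rewind(fp);
    buf = malloc((size_t)size);
    if (buf == NULL || fread(buf, 1, (size_t)size, fp) != (size_t)size) {
        free(buf);
        fclose(fp);
        return NULL;
    }
    fclose(fp);
    *nrecs = (size_t)size / RECORD_SIZE;
    return buf;
}

/* After the single pass over the floppy, every lookup is a RAM bsearch. */
char *find_in_core(char *db, size_t nrecs, const char *key)
{
    return bsearch(key, db, nrecs, RECORD_SIZE, cmp_rec);
}

The one sequential read of the floppy is slow, but after that every
lookup runs at memory speed.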

Bill Rust (wjr at ftp.com)


