BIT Counting: C programming problem

Clayton Cramer cramer at optilink.UUCP
Fri Jan 13 05:46:57 AEST 1989


In article <35329 at think.UUCP>, barmar at think.COM (Barry Margolin) writes:
> I haven't seen anyone so far suggest the far end of the time/space
> trade-off: have a 64K table, with each element N containing the number
> of bits in N (you'll probably have to use one of the slower mechanisms
> to compute the contents of this table, but it only needs to be run
> once when writing the program and then incorporated into the program
> text as an initial value).  Someone else came close by suggesting a
> 256-element table which would be consulted twice and then the results
> added.  But is an extra 64Kbytes of data much of a problem in these
> days of large and/or virtual memories?

Someone must own stock in DRAM companies.
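
For concreteness, though, here's a rough sketch of the table scheme in
C: a 256-entry table consulted once per byte of a 16-bit value.  The
64K table is the same idea with a single lookup.  The table is built at
run time here just to keep the example short; the point above is that
you'd precompute it once and put it into the program text as
initializers.

#include <stdio.h>

static unsigned char bits8[256];        /* bits8[n] == number of 1 bits in n */

int count16(unsigned int n)             /* two lookups for a 16-bit value */
{
    return bits8[n & 0xff] + bits8[(n >> 8) & 0xff];
}

int main(void)
{
    int i;

    /* Fill the table with one of the "slower" methods, run once. */
    for (i = 1; i < 256; i++)
        bits8[i] = bits8[i >> 1] + (i & 1);

    printf("%d\n", count16(0xF0F0));    /* prints 8 */
    return 0;
}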

To be fair, my first summer programming job involved having to
sort a randomly entered sequence of part numbers, and two data values
associated with each part number.  (Part numbers were hex -- my boss
hadn't figured out how to unpack binary to decimal in assembler
language -- that's why he hired me.)  Since I didn't know how many
records I would get, and thus didn't know how much memory to allocate,
I created a file of four-byte records (initialized to -1), used
the part numbers as a random-access key, and wrote the four bytes of
data into the corresponding records.  Then, when I was finished, I
reopened the file for sequential access and read through it.

At the time, I thought, "What a neat way to sort all these records".
Of course, I was only 16, so I suppose it was forgivable. :-)
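
(For anyone curious, here's roughly what that looks like in C rather
than the original assembler.  The file name, the record layout of two
16-bit values, and the 16-bit part-number range are guesses for
illustration, not the real program.)

#include <stdio.h>

struct rec { short val1, val2; };            /* two data values, 4 bytes */

#define NPARTS 0x10000L                      /* assuming 16-bit part numbers */

int main(void)
{
    FILE *fp = fopen("parts.dat", "w+b");
    struct rec empty = { -1, -1 }, r;
    long part;

    if (fp == NULL)
        return 1;

    /* Write one "empty" record per possible part number. */
    for (part = 0; part < NPARTS; part++)
        fwrite(&empty, sizeof empty, 1, fp);

    /* As each input record arrives, seek to part * sizeof(rec) and
       overwrite that slot -- the part number itself is the file key. */
    r.val1 = 10; r.val2 = 20;                /* sample record for part 0x1A2B */
    fseek(fp, 0x1A2BL * (long)sizeof r, SEEK_SET);
    fwrite(&r, sizeof r, 1, fp);

    /* A sequential pass now yields the records in part-number order;
       slots still holding -1 were never filled in. */
    fseek(fp, 0L, SEEK_SET);
    for (part = 0; part < NPARTS; part++) {
        if (fread(&r, sizeof r, 1, fp) != 1)
            break;
        if (r.val1 != -1 || r.val2 != -1)
            printf("%04lX %d %d\n", part, r.val1, r.val2);
    }

    fclose(fp);
    return 0;
}
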
-- 
Clayton E. Cramer
{pyramid,pixar,tekbspa}!optilink!cramer
Disclaimer?  You must be kidding!  No company would hold opinions like mine!


