compressed news on a 7300 with 512K RAM

David H. Brierley dave at galaxia.Newport.RI.US
Mon Jul 3 11:58:43 AEST 1989


In article <2464 at ditka.UUCP> kls at ditka.UUCP (Karl Swartz) writes:
>Until you can get more memory, try using the -b12 option on compress.  It
>will then use smaller tables and thus less memory.  Offhand I can't recall
>if it's enough less to keep from thrashing in 512K, but it should be a lot
>better.  Make sure your newsfeed generates 12 bit compress batches too.

If you want to see an incredible increase in processing speed, rebuild the
compress program so that it is only capable of doing a 12 bit compression.
I did this on my machine and instead of taking almost a minute to compress
a batch file, it now takes only 8 seconds.  I also sped things up by making
a fake rnews program that uncompresses the incoming batch and stores it in
the .rnews directory; the real rnews then runs later from cron.  The
primary reason for all of the thrashing when you are doing an uncompress or
a compress -b12 is the size of the executable.  By building a special version
of compress with smaller tables you make the executable smaller and thus are
less likely to require swapping.  The fake rnews that I built had the code
for doing an uncompress without having all the huge tables required for doing
a compress and was thus much smaller.  The fake rnews is almost as fast as
doing a copy, even when your news feed sends a batch that has been compressed
with the full 16 bit compression.  I have 1 meg of memory, so I am a little
better off than someone with only 512K, but I am still concerned about
squeezing every last ounce of power out of this machine.
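To put rough numbers on the table-size argument above, here is a back-of-the-envelope sketch.  It assumes the HSIZE constants from the classic compress.c (69001 hash slots at BITS=16, 5003 at BITS=12) and 1989-era sizes of 4 bytes for the longs in htab and 2 bytes for the shorts in codetab; decompression needs only a prefix (short) and suffix (char) entry per code.  Treat the exact constants as assumptions about your particular compress source.

```shell
#!/bin/sh
# Approximate working-set sizes for compress vs. uncompress, using the
# HSIZE constants from the classic compress.c (assumed: 69001 slots at
# BITS=16, 5003 at BITS=12) and 1989-era type sizes (long=4, short=2).
comp16=$((69001 * (4 + 2)))          # htab (long) + codetab (short), 16-bit
comp12=$((5003 * (4 + 2)))           # same tables in a 12-bit-only build
decomp16=$(( (1 << 16) * (2 + 1) ))  # prefix (short) + suffix (char) arrays
decomp12=$(( (1 << 12) * (2 + 1) ))
echo "compress   tables: 16-bit=$comp16 bytes, 12-bit=$comp12 bytes"
echo "uncompress tables: 16-bit=$decomp16 bytes, 12-bit=$decomp12 bytes"
```

Roughly 414K of tables to compress at 16 bits against about 30K at 12 bits, which is why a 12-bit-only build barely pages on a small machine, and why a decompress-only program stays modest (under 200K) even on full 16-bit batches.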

If anyone is interested in the fake rnews program let me know and I will send
you a copy.
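For the general shape of the idea, here is a minimal sketch (not David's actual program): it assumes compressed batches arrive on stdin with the usual "#! cunbatch" header line, and it spools each batch under a directory that the real rnews, run from cron, would later sweep.  The SPOOL path and the fake_rnews name are invented here for illustration.

```shell
#!/bin/sh
# Hypothetical sketch of the "fake rnews" scheme: spool each incoming
# batch (uncompressing it if need be) instead of running the real rnews
# as it arrives; a cron job runs the real rnews over the spool later.
SPOOL=${SPOOL:-$(mktemp -d)}   # stand-in for the real .rnews directory
mkdir -p "$SPOOL"

fake_rnews() {
    tmp="$SPOOL/.in.$$"
    IFS= read -r header                  # peek at the batch's first line
    case $header in
    '#! cunbatch'*)                      # rest of stdin is compressed
        compress -d > "$tmp" ;;
    *)                                   # already a plain batch
        { printf '%s\n' "$header"; cat; } > "$tmp" ;;
    esac
    mv "$tmp" "$SPOOL/batch.$$"          # rename so cron sees a whole file
}

# Example: spool a plain (uncompressed) batch
printf '#! rnews 12\nhello world\n' | fake_rnews
```

Since the plain-batch path never touches compress at all, and the compressed path needs only the decompression tables, this keeps the per-batch working set small; the rename at the end ensures the cron-run rnews never picks up a half-written batch.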
-- 
David H. Brierley
Home: dave at galaxia.Newport.RI.US   {rayssd,xanth,lazlo,mirror}!galaxia!dave
Work: dhb at rayssd.ray.com           {sun,decuac,gatech,necntc,ukma}!rayssd!dhb



More information about the Unix-pc.general mailing list