unix question: files per directory

Operator root at helios.toronto.edu
Wed Apr 19 06:27:26 AEST 1989


In article <4822 at macom1.UUCP> rikki at macom1.UUCP (R. L. Welsh) writes:
>From article <24110 at beta.lanl.gov>, by dxxb at beta.lanl.gov (David W. Barts):
>> 
>> How many files can there be in a single UNIX directory
>
>You will undoubtedly run out of inodes before you reach any theoretical
>limit.  

Another thing you may run into is that some UNIX utilities seem to store
the names of all of the files somewhere before they do anything with them,
and if there are a lot of files in the directory, you won't be able to
run the utility on all of them at once. (This won't prevent you from creating
them, though.) In particular, I am thinking of "rm". When cleaning up after
installing the NAG library, I tried to "rm *" in the source code directory.
It refused (I think the error was "too many files"). I had to go through and 
"rm a*", "rm b*" etc. until it was down to a level that rm would accept. I 
found this surprising. In at least the case of wildcard matching, why wouldn't 
it just read each name from the directory file in sequence, comparing each for 
a match, and deleting it if it was? Having to buffer *all* the names builds in 
an inherent limit such as the one I ran into, unless one uses a linked list 
or some such.
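The name-by-name approach described above could be sketched roughly as follows. This is only an illustration of the idea, not rm's actual source; the function name rm_matching and the fixed path buffer are my own inventions, and it assumes the POSIX opendir/readdir/fnmatch/unlink calls. Note that it never holds more than one name in memory at a time, so it has no limit tied to the number of entries.

```c
/* Hypothetical sketch: remove entries matching a shell-style pattern
 * by scanning the directory one entry at a time, deleting as we go,
 * instead of buffering every name first.  Returns the number of
 * files removed, or -1 if the directory cannot be opened. */
#include <stdio.h>
#include <dirent.h>
#include <fnmatch.h>
#include <unistd.h>

int rm_matching(const char *dir, const char *pattern)
{
    DIR *dp = opendir(dir);
    struct dirent *ent;
    int removed = 0;

    if (dp == NULL)
        return -1;

    while ((ent = readdir(dp)) != NULL) {
        /* Skip "." and "..", as rm must. */
        if (ent->d_name[0] == '.' && (ent->d_name[1] == '\0' ||
            (ent->d_name[1] == '.' && ent->d_name[2] == '\0')))
            continue;

        if (fnmatch(pattern, ent->d_name, 0) == 0) {
            char path[4096];            /* arbitrary illustrative size */
            snprintf(path, sizeof path, "%s/%s", dir, ent->d_name);
            if (unlink(path) == 0)
                removed++;
        }
    }
    closedir(dp);
    return removed;
}
```

(One wrinkle a real implementation would have to mind: POSIX leaves it unspecified whether entries removed mid-scan are seen again by readdir, so a careful version would tolerate unlink failing with ENOENT, as this one does by simply not counting it.)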

Does anyone know:
     1. why "rm" does it this way, and
     2. are there other utilities similarly affected?

I don't know exactly how many files were in the directory, but it was many
hundreds.
-- 
 Ruth Milner          UUCP - {uunet,pyramid}!utai!helios.physics!sysruth
 Systems Manager      BITNET - sysruth at utorphys
 U. of Toronto        INTERNET - sysruth at helios.physics.utoronto.ca
  Physics/Astronomy/CITA Computing Consortium


