Speed costs (Re: MWC's Coherent - A Lemon...)

Chip Salzenberg chip at tct.uucp
Fri May 25 23:15:17 AEST 1990


[[ Followups to comp.arch ]]

According to jca at pnet01.cts.com (John C. Archambeau):
>peter at ficc.ferranti.com (Peter da Silva) writes:
>>Did you know that C-news runs in small model?
>
>So what if C-News runs in small model.
>A vast majority of C compilers won't.

Competent C compilers can be written to run in small model.  I once
worked on a C compiler that ran on a PDP-11, which, as everyone knows,
is limited to 64K of data under most (all?) Unix implementations.
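
For anyone who hasn't met the term: "small model" is the 8086 memory
model in which code and data each fit in a single 64K segment.  A
minimal sketch of what that implies, in standard C and tied to no
particular vendor's compiler:

    #include <stdio.h>

    int main(void)
    {
        /* Under the small model, a data pointer is a bare 16-bit
         * offset into the one 64K data segment; the large model
         * carries a segment as well, doubling pointer size. */
        printf("sizeof(char *) = %u\n", (unsigned) sizeof(char *));
        /* Expect 2 from a small-model 8086 compiler and 4 from a
         * large-model one. */
        return 0;
    }

The PDP-11 enforces the same 64K data ceiling directly, through its
16-bit user address space, with no segment games at all.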

The old saw that programs expand to fill the memory available to them
is true.  The primary reason mundane programs use large memory spaces
is the tendency of programmers to attack problems with brute force
until the computer they're using runs out of force.  The brute-force
line used to be crossed quite early; not so today.  Too bad.

I have in the past focused almost exclusively on kernel bloat as the
Evil Memory Waster Of Our Time.  However, I now believe that I was
mistaken.  As much as the Unix kernel hackers have caused their baby
to grow in recent years, the utility programs and support code have
contributed at least as much bloat as the kernel.  There is plenty of
blame to go around.

As Henry Spencer has so often pointed out, thinking small seems to be
a lost art[*], which is a pity.  The X Window System could use a small
thinker, possibly for the purpose of discarding X entirely.

[*] Were I a cynic, I might wonder if thought of any kind is in short
supply among today's programmers.  I might also cite Sturgeon's Law:
"Ninety percent of everything is crap."  However, as I am not a cynic,
I shall refrain.
-- 
Chip Salzenberg at ComDev/TCT   <chip%tct at ateng.com>, <uunet!ateng!tct!chip>


